Recent comments posted to this site:

It's fine to annex the big files and store the small files in git in the usual way.

The find | xargs approach should work.

You can also use the git-annex-matching-options, e.g.:

git annex add --include='*.adi'

Or:

git annex add --largerthan=1mb

You can also configure git-annex to know which files you consider large, so that git annex add will annex the large ones and add the rest to git, not the annex. See largefiles.
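For instance, the two criteria above can be combined into one largefiles configuration (the expression here is just an example; adjust it to your own definition of "large"):

```shell
# Example annex.largefiles expression: annex files over 1mb, or *.adi files.
git config annex.largefiles "largerthan=1mb or include=*.adi"

# With that set, a single add does the right thing for both kinds of files:
# matching files are annexed, everything else goes into git directly.
git annex add .
```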

Comment by joey Sat Jul 30 16:36:51 2016

If you're finding vicfg confusing, you don't have to use it; the same configuration can be done at the command line using git annex wanted. See preferred content for documentation.

If you just want to have full control over what files are stored in the local repository, the easiest way to do that is to run git annex get and git annex drop manually when you want or don't want a file's content, and don't pass --content to git annex sync.
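To make that concrete, here is the command-line version of both approaches (the preferred-content expression is only an illustration):

```shell
# Equivalent of editing preferred content in vicfg, done at the command line
# ("." means the current repository; the expression is just an example):
git annex wanted . "include=*.mp3 or include=*.ogg"

# Or skip preferred content entirely and manage file content by hand:
git annex get somefile     # fetch this file's content into the local repo
git annex drop somefile    # remove the local copy (when enough copies exist)
git annex sync             # without --content, only git metadata is synced
```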

Comment by joey Sat Jul 30 16:30:58 2016

The disk-free-space and mountpoints dependencies will need to be backported to Debian jessie before git-annex's backport can be updated. Both packages are available in testing, so they should be fairly easy to backport.

Comment by joey Sat Jul 30 16:15:57 2016
Hey, thank you very much! This is exactly what I was looking for. Unfortunately, FreeBSD sports version 5.20150727, where the --known option is not yet available, but that's a whole different problem that can be easily solved on a Saturday.
Comment by squid Fri Jul 29 22:27:51 2016

I'm not sure about creating a branch, but I use https://git-annex.branchable.com/git-annex-find/ to list locally available files.

Good luck!
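A minimal example, if all you want is the list of files whose content is present in the local repository:

```shell
# With no matching options, "git annex find" lists files whose content
# is present locally.
git annex find

# The same thing, spelled out explicitly:
git annex find --in=here
```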

Comment by jd.schroeder Fri Jul 29 19:12:24 2016
Comment by jd.schroeder Fri Jul 29 19:09:44 2016
This was concerning enough to me that I wound up switching to Syncthing for this particular use case.
Comment by jgoerzen Mon Jul 25 19:09:09 2016
Anyone got any ideas on how to fix this, or move forward without starting again and trashing the gcrypt repos? Bit of a serious bug this, if it can't be repaired. I was using git-annex for backups, but finding your backups are corrupt would be a bit of a nightmare!
Comment by pot Mon Jul 25 18:23:13 2016

--batch mode should be usable to get current metadata, set new metadata, and remove existing metadata. The non-batch metadata command has different syntaxes for all of these, but it would be good to have a single interface that handles all three in batch mode.

It could read a line containing the file or key, with any metadata fields that should be changed:

{"file":"foo"}
{"file":"foo","author":["bar"]}
{"key":"SHA...","author":[]}

And reply with all the metadata, in nearly the same format:

{"file":"foo","key":"SHA...","author":["bar"],"lastchanged":["date"],"success":true}

And that reply could in turn be edited and fed back in to change the metadata.
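A round trip might look something like this; note the --batch behavior sketched here is the proposal above, not necessarily what a released git-annex does:

```shell
# Hypothetical: drive the proposed batch interface, one JSON request per line.
git annex metadata --batch --json <<'EOF'
{"file":"foo"}
{"file":"foo","author":["bar"]}
EOF
# Under the proposal, each input line would yield one JSON reply line, e.g.:
# {"file":"foo","key":"SHA...","author":["bar"],"lastchanged":["date"],"success":true}
```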


There's a DRY problem here, because there's the current JSON generator code, and I'd have to add an Aeson parser to parse the JSON input. But Aeson parsers also automatically have a matching generator, which is guaranteed to generate JSON that the parser can parse.

So, it would be nice to use the Aeson JSON generator, instead of the current one, but that can only be done if the JSON is formatted the same, or close enough that nothing currently consuming metadata --json will break.

Comment by joey Mon Jul 25 17:39:37 2016
Ah, saw this too late. Also see 2aa0841940d309b858eed5bc156262a7d90c949b

Comment by RichiH Wed Jul 20 05:48:15 2016