As the sync command just got a -J option, I have started to use threads to transfer new files to multiple remotes in parallel. A question that arises is whether I can still assume that only one instance of the special remote will be started. I store quite a bit of state in the special remote program that I use to connect to a cloud server, and running two instances in parallel might run into all sorts of fun behavior.
That said, I am of course aware that if I ran multiple git annex commands I would run into these problems anyway.
Well, I can confirm that only one instance of any external special remote will be run per git-annex process currently.
Actually, it's a bit more than that -- if multiple threads try to use the same external special remote at the same time, everything is necessarily serialized, and so using -J won't let multiple downloads happen at once from a single external special remote, although it may still usefully parallelize among several remotes.
Since that seems like something worth dealing with, perhaps by having a pool of external special remote processes, I don't feel comfortable making any promises about the behavior. And as you note, it's easy to get multiple processes by running multiple git-annex commands, so that is something an external special remote needs to deal with anyway.
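For illustration, here is a minimal, hypothetical sketch of one way an external special remote written in Python could protect shared on-disk state against a second instance started by a concurrent git-annex command: take an exclusive flock around every read-modify-write of the state. The state file path, its JSON contents, and the `exclusive_state` helper are invented for this example; they are not part of git-annex or its external special remote protocol.

```python
# Sketch only: guard an external special remote's shared state file with
# an exclusive file lock, so two remote processes (started by concurrent
# git-annex commands) never update it at the same time. POSIX-only (fcntl).
import fcntl
import json
import os
from contextlib import contextmanager

STATE_FILE = os.path.expanduser("~/.myremote/state.json")  # hypothetical path
LOCK_FILE = STATE_FILE + ".lock"

@contextmanager
def exclusive_state():
    """Yield the current state dict under an exclusive lock.

    Another instance of the remote blocks here until the lock is released,
    then sees whatever this instance wrote."""
    os.makedirs(os.path.dirname(STATE_FILE), exist_ok=True)
    with open(LOCK_FILE, "w") as lock:
        fcntl.flock(lock, fcntl.LOCK_EX)
        try:
            if os.path.exists(STATE_FILE):
                with open(STATE_FILE) as f:
                    state = json.load(f)
            else:
                state = {}
            yield state
            # Write atomically: dump to a temp file, then rename over the old one.
            tmp = STATE_FILE + ".tmp"
            with open(tmp, "w") as f:
                json.dump(state, f)
            os.replace(tmp, STATE_FILE)
        finally:
            fcntl.flock(lock, fcntl.LOCK_UN)

if __name__ == "__main__":
    # Example read-modify-write: bump a (hypothetical) upload counter.
    with exclusive_state() as state:
        state["uploads"] = state.get("uploads", 0) + 1
```

The same idea works for any cached credentials or connection bookkeeping the remote keeps on disk; the lock only needs to cover the read-modify-write, not the whole transfer.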
Thanks for a clear answer. It seems that I am still in the sweet spot for using -J4, i.e. transferring everything to multiple remotes in parallel.
I do, however, take it as a recommendation to be careful, and will give this more thought if I do further development work on the remote and release it.