If you are doing something to a bunch of files, perhaps a shell script like:
for i in *jpg; do something "$i" somethingelse; done
and want a simple way to parallelize this, try xargs, which I love:
ls *jpg | xargs -n1 -P5 -I% something % somethingelse
will run on five files at a time. I've used this trick two or three times today in re-synchronizing photos from two phones and a camera and an SD card and Dropbox. #xargs #shell #protip
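If GNU coreutils is handy, a variant that sizes the parallelism to the machine instead of hard-coding 5 (nproc prints the CPU count, and -I% already runs one command per file, so -n1 can be dropped):
xargs -P "$(nproc)" -I% something % somethingelse < <(ls *jpg)
Same placeholders as above; just a sketch, adjust to taste.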
@22 the find command is also super handy for this since it will recurse through directories
@fenwick67 oh nice, so find can feed this same xargs trick and it'll recurse through directories too? Rad, thanks!
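Something like this, presumably (same placeholder something as above; -print0 and -0 keep filenames with odd characters intact):
find . -name '*jpg' -print0 | xargs -0 -P5 -I% something % somethingelse
which recurses into subdirectories and still runs five at a time.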
It's a tad more involved when the filenames have spaces: one trick is to replace each newline with the null character using tr, then tell xargs to expect null-terminated entries with -0, so it stops splitting on spaces and the like. A real example:
cat bad.txt | tr '\n' '\0' | xargs -0 -P8 -n1 -I% dropbox_uploader.sh delete "/Camera Uploads/%"
I have a list of bad filenames in bad.txt, and the superb Dropbox-Uploader https://github.com/andreafabrizi/Dropbox-Uploader will delete them eight at a time. #tr #protip
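Side note: if your xargs is GNU xargs, -d '\n' can skip the tr step entirely; a sketch reusing the same bad.txt and dropbox_uploader.sh call:
xargs -d '\n' -P8 -I% dropbox_uploader.sh delete "/Camera Uploads/%" < bad.txt
BSD/macOS xargs doesn't have -d, though, so the tr-to-null version above is the more portable one.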