Update: it turns out all the target files had already been deleted, so of course when I try to run any deletes again, the files aren't there to remove. Sorry to take your time.
I'm using bash on cygwin.
I have the output of fdupes in a file. I'm grepping the output to exclude a directory I want to keep intact, and I want to delete the rest of the files listed.
I have some entries with spaces:
./NewVolume/keep/2009/conference/conference/conference 004.jpg
Which trips up xargs:
$ cat real-dupes.txt |xargs rm {}
...
rm: cannot remove ‘2009/conference/conference/conference’: No such file or directory
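For reference, the failure happens because xargs splits its input on any whitespace by default, so a path with spaces becomes several arguments. GNU xargs (the version Cygwin ships) can be told to split only on newlines with -d '\n'. A minimal sketch, assuming none of the paths contain embedded newlines (the demo directory and list file are stand-ins, not the real data):

```shell
# Stand-in for real-dupes.txt: one path containing a space
mkdir -p demo && touch "demo/conference 004.jpg"
printf '%s\n' "demo/conference 004.jpg" > real-dupes-demo.txt

# -d '\n' makes xargs treat each whole line as a single argument,
# so the embedded space no longer splits the path
xargs -d '\n' rm -- < real-dupes-demo.txt
```

The `--` guards against any listed path that happens to begin with a dash.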
When I try the -0 switch, it looks like the lines get globbed together:
$ cat real-dupes.txt |xargs -0 rm
xargs: argument line too long
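That error is expected: -0 tells xargs the input is NUL-delimited, so a newline-delimited file looks like one enormous argument with no delimiters in it. One workaround is to convert the newlines to NULs first; a sketch against a hypothetical demo list:

```shell
# Stand-in list with a space-containing path
mkdir -p demo0 && touch "demo0/conference 004.jpg"
printf '%s\n' "demo0/conference 004.jpg" > dupes-demo.txt

# tr rewrites the newline delimiters as NULs so xargs -0 can parse them
tr '\n' '\0' < dupes-demo.txt | xargs -0 rm --
```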
Other questions have answers where the asker is advised to use find to feed the arguments into xargs. That's not helpful in my scenario, because I don't believe I can easily use find to identify the duplicates I want to get rid of. Also, the fdupes job ran for some 12+ hours, so I really want to reuse this data set.
As far as I know, fdupes cannot exclude a directory from its automated delete, so I can't use it out of the box, either.
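Putting the pieces together, the exclude-and-delete step could look something like this. A sketch only: keep/ stands in for the directory to preserve, and the demo files replace the real fdupes output. The while read loop is an xargs-free alternative that also copes with spaces:

```shell
# Demo setup: two duplicates, one under the directory to keep
mkdir -p sample/keep sample/junk
touch "sample/keep/conference 004.jpg" "sample/junk/conference 004.jpg"
printf '%s\n' "sample/keep/conference 004.jpg" \
              "sample/junk/conference 004.jpg" > fdupes-out.txt

# Drop everything under keep/, then delete the rest line by line;
# IFS= and -r keep leading spaces and backslashes intact
grep -v '/keep/' fdupes-out.txt | while IFS= read -r f; do
    rm -- "$f"
done
```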