
I am downloading data spread among multiple files that don't change once published.

Therefore --timestamping is not good enough: it constantly checks whether each resource has changed, which in my case is completely pointless.

--no-clobber would fit perfectly. Unfortunately, for some reason it does not work together with --convert-links; wget warns:

    Both --no-clobber and --convert-links were specified, only --convert-links will be used.

I hoped that --backup-converted would help, but it changed nothing (it only has an effect together with --timestamping).

Why does wget --convert-links --backup-converted --no-clobber --wait 1 https://example.com ignore --no-clobber, and how can this be fixed?

2 Answers


Is there any possibility of using rsync? You would need ssh access to the system (I have never seen rsync used for http:// downloads).

Something like this will fetch files you have never downloaded and will resume files whose transfer stopped for some reason:

    rsync -avzhP -e ssh [email protected]:/remotefolder/ /localfolder/

This may not work for you; there is not enough information about what you are trying to do.
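If ssh access is indeed available, a --dry-run first shows what rsync would transfer without copying anything. The host and paths below are placeholders, not taken from the question:

```shell
# -a archive mode, -v verbose, -z compress in transit,
# -h human-readable sizes, -P show progress and keep partial files for resuming,
# -e ssh use ssh as the transport.
# --dry-run lists the planned transfers without touching the local folder.
rsync -avzhP -e ssh --dry-run user@remote.example:/remotefolder/ /localfolder/
```

Dropping --dry-run then performs the actual copy; already-transferred, unchanged files are skipped automatically, which is the behavior --no-clobber was meant to provide.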

– Mark Stewart

You cannot use both --convert-links and --no-clobber. You will get this message:

    Both --no-clobber and --convert-links were specified, only --convert-links will be used.

wget can accomplish what you want if you specify --convert-links and --timestamping instead. But this only works if the target site supplies the file's timestamp (a Last-Modified header) in its responses.
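A minimal sketch of that combination, reusing the flags and the example URL from the question with --no-clobber swapped for --timestamping:

```shell
# On a re-run, wget compares the server's Last-Modified header against the
# local file's mtime and skips unchanged files, then rewrites links for
# local viewing; --backup-converted keeps a .orig copy so the comparison
# uses the unconverted file.
wget --timestamping --convert-links --backup-converted --wait 1 https://example.com
```

Note that the timestamp check still sends one request per file, so for files that never change this trades --no-clobber's zero requests for a cheap conditional check.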

– RashaMatt
  • This answer contains nothing that was not already mentioned in the very first revision (https://unix.stackexchange.com/posts/177330/revisions) of my question. – reducing activity Jul 09 '17 at 09:16