Reputation: 1585
I'm downloading a website with wget. The command is as follows:
wget -nc --recursive --page-requisites --html-extension --convert-links --restrict-file-names=windows --domain any-domain.com --no-parent http://any-domain.com/any-page.html
Does the -nc option skip downloading existing files even when downloading a website recursively? It seems the -nc option does not work.
Upvotes: 2
Views: 7289
Reputation: 1
The --convert-links option seems to conflict with -nc. Try removing it.
Upvotes: 0
Reputation: 5165
Yes, the -nc option will prevent re-downloading the file.
The manual page is confusing because it describes all of the related options together.
Here are the pertinent bits from the man page:
When running Wget with -r or -p, but without -N or -nc, re-downloading a file will result in the new copy simply overwriting the old. Adding -nc will prevent this behavior, instead causing the original version to be preserved and any newer copies on the server to be ignored.
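You can verify this behavior locally without hitting a real site. The following is only a sketch: the port number, directory names, and the python3 one-line HTTP server are arbitrary assumptions made for the demonstration, not part of the original question.

```shell
# Serve a small file locally, then fetch it twice with -nc.
mkdir -p srv
echo "version 1" > srv/index.html
( cd srv && exec python3 -m http.server 8000 ) >/dev/null 2>&1 &
SERVER_PID=$!
sleep 1

wget -q -nc http://127.0.0.1:8000/index.html          # first run: file is downloaded
echo "version 2" > srv/index.html                     # content changes on the "server"
wget -q -nc http://127.0.0.1:8000/index.html || true  # second run: skipped because the file exists
cat index.html                                        # local copy is preserved

kill "$SERVER_PID"
```

If -nc is working, the final `cat` shows "version 1": the newer copy on the server is ignored and the original download is kept, exactly as the man page describes.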
Upvotes: 0
Reputation: 354
The man page says:
-nc --no-clobber If a file is downloaded more than once in the same directory, Wget's behavior depends on a few options, including -nc. In certain cases, the local file will be clobbered, or overwritten, upon repeated download.
Here are more details (also from the man page):
When running Wget with -r or -p, but without -N, -nd, or -nc, re-downloading a file will result in the new copy simply overwriting the old.
Upvotes: 4