James

Reputation: 97

Download multiple URLs using wget in Windows?

I am trying to download hundreds of files, and I am a Windows guy. I searched online and found a solution, but now have new questions.

Here is what I did:

  1. I put all the URLs into a text file, one URL per line; the file is called download.txt (an example is shown after this list).

  2. In a command window, type:

    wget -i download.txt

  3. This successfully downloads the files.
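For example, download.txt looks something like this (made-up URLs here; the real ones follow the same pattern):

    http://example.com/files/report_001.pdf
    http://example.com/files/report_002.pdf
    http://example.com/files/report_003.pdf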

However, the server does not seem very stable, and sometimes I get

Error 500: Internal server error

Then I have to pick out the files that were not downloaded. That is tedious work, since the file names are very similar and there are hundreds of them.

My question: Is there an easy way to automatically pick these files out and download them again? Or is there a way to make wget retry whenever a file fails?
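Ideally I could just rerun something like this and have wget skip whatever is already complete (just a sketch; I am not sure these are the right flags):

    wget -c -i download.txt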

Thanks for your help.

Upvotes: 0

Views: 2778

Answers (1)

Paul

Reputation: 2710

500 Internal Server Error
A generic error message, given when an unexpected condition was encountered and no more specific message is suitable.

Try to download more gently. These settings will also help you avoid getting banned from the website ^^

wget -b -q -nc -c --limit-rate=150k -i download.txt

-b,  --background       go to background after startup
-q,  --quiet            quiet (no output).
-nc, --no-clobber       skip downloads that would download to existing files.
-c,  --continue         resume getting a partially-downloaded file.
     --limit-rate=RATE  limit download rate to RATE.

(-N/--timestamping is left out here because wget refuses to combine it with -nc/--no-clobber and exits with an error.)
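You can also make wget retry on its own when the server hiccups. By default wget gives up on HTTP errors such as 500, but if your wget is 1.19 or newer you can list status codes that are worth retrying. A sketch (tune the numbers to your situation):

wget -t 5 --retry-on-http-error=500 -w 2 --random-wait -c -i download.txt

-t,  --tries=NUMBER             set number of retries to NUMBER (0 unlimits)
-w,  --wait=SECONDS             wait SECONDS between retrievals
     --random-wait              wait from 0.5*WAIT...1.5*WAIT secs between retrievals
     --retry-on-http-error=ERRORS   comma-separated list of HTTP errors to retry

Rerunning this same command later will also skip or resume anything already on disk, thanks to -c.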

Upvotes: 1
