Reputation: 91
I've been using this to wget a large amount of URLs from a list I have stored in a text file:
wget -i websites.txt
However, it appears that wget downloads one file at a time and only then moves on to the next. I may be wrong about that; if so, please feel free to let me know.
But what if I wanted it to download 10 or 20 files simultaneously? Is that possible to do with a simple wget command or is it going to require something more elaborate?
By the way, these are all extremely small files (~80kb) being downloaded. It just seems to take forever when downloading millions...
Upvotes: 2
Views: 2351
Reputation: 12255
You can use the parallel command:
parallel -a websites.txt --jobs 10 wget
Here, -a reads arguments from the file (one per line), --jobs sets how many jobs run in parallel, and each line from the file is appended to the wget command as its URL argument.
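If GNU parallel isn't installed, xargs can do the same thing; -n 1 passes one URL per invocation and -P 10 keeps up to 10 wget processes running at once. A rough sketch (the example URLs are placeholders, and echo is prepended as a dry run so you can see the commands that would execute; remove it to actually download):

```shell
# Hypothetical URL list standing in for your websites.txt
printf '%s\n' http://example.com/a http://example.com/b > websites.txt

# -a reads items from the file (GNU xargs; on BSD/macOS use: < websites.txt xargs ...)
# -n 1 gives each wget invocation one URL; -P 10 runs up to 10 at a time
xargs -a websites.txt -n 1 -P 10 echo wget
```

Note that -a is a GNU xargs option; redirect stdin instead on BSD/macOS.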
Upvotes: 7