Reputation: 199
I am trying to download a large number of files from a webpage (each page contains only an image, so I can use a simple wget), but I want to speed it up using GNU Parallel. Can anyone please help me parallelize this for loop? Thanks.
for i in $(seq 1 1000)
do
    wget -O "$i.jpg" www.somewebsite.com/webpage
done
Upvotes: 2
Views: 2806
Reputation: 33327
You could do it like this:
seq 1 1000 | parallel wget www.somewebsite.com/webpage/{}.jpg
You can also use the -P option to specify the number of jobs you want to run concurrently.
Alternatively, you could use curl instead:
parallel -P 1000 curl -o {}.jpg www.somewebsite.com/webpage/{}.jpg ::: {1..1000}
Upvotes: 4