Dillon Cortez

Reputation: 176

How do I download a large number of zip files from a URL with wget?

At the URL here there are a large number of zip files that I need to download and save to the test/files/downloads directory. I'm using wget with the command

wget -i http://bitly.com/nuvi-plz -P test/files/downloads

and it downloads the whole page into a file inside the directory, then starts downloading each zip file but gives me a 404 for each one that looks something like

2016-05-12 17:12:28--  http://bitly.com/1462835080018.zip
Connecting to bitly.com|69.58.188.33|:80... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://bitly.com/1462835080018.zip [following]
--2016-05-12 17:12:28--  https://bitly.com/1462835080018.zip
Connecting to bitly.com|69.58.188.33|:443... connected.
HTTP request sent, awaiting response... 404 Not Found
2016-05-12 17:12:29 ERROR 404: Not Found.

How can I get wget to download all the zip files on the page properly?

Upvotes: 0

Views: 2603

Answers (1)

Jared Rummler

Reputation: 38121

You need to resolve the bit.ly redirect first and then download the files from the real page. This is really ugly, but it works:

# Follow the bit.ly redirect to find the real page URL, then grab every .zip on it
wget http://bitly.com/nuvi-plz --server-response -O /dev/null 2>&1 | \
  awk '(NR==1){SRC=$3;} /^  Location: /{DEST=$2} END{print SRC, DEST}' | sed 's|.*http|http|' | \
while read -r url; do
  # -A zip: accept only .zip files; -r -l 1: recurse one level; -nd: no directory tree
  wget -A zip -r -l 1 -nd "$url" -P test/files/downloads
done
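To see what the awk stage of that pipeline is extracting, here is a self-contained sketch run against a fabricated log excerpt (shaped like the one in the question, but with the direct link from the answer spliced in as the `Location:` value for illustration):

```shell
# Fabricated wget --server-response output, for illustration only
log='--2016-05-12 17:12:28--  http://bitly.com/nuvi-plz
HTTP request sent, awaiting response... 302 Found
  Location: http://feed.omgili.com/5Rh5AMTrc4Pv/mainstream/posts/ [following]'

# Pull the URL out of the last "Location:" header, as the pipeline above does;
# the last one matters because a short link may redirect more than once
dest=$(printf '%s\n' "$log" | awk '/^  Location: /{print $2}' | tail -n 1)
echo "$dest"
```

Once you have that final URL, it can be fed straight to the second wget as shown above.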

If you use the direct link, this will work:

wget -A zip -r -l 1 -nd http://feed.omgili.com/5Rh5AMTrc4Pv/mainstream/posts/ -P test/files/downloads

Upvotes: 2
