Arland

Reputation: 67

using wget to download a directory

I'm trying to download all the files in an online directory. The command I'm using is:

wget -r -np -nH -R index.html http://www.oecd-nea.org/dbforms/data/eva/evatapes/mendl_2/

Using this command I get an empty directory. If I specify file names at the end I can get one at a time, but I'd like to get them all at once. Am I just missing something simple?

Output from the command:

--2015-03-14 14:54:05--  http://www.oecd-nea.org/dbforms/data/eva/evatapes/mendl_2/
Resolving www.oecd-nea.org... 193.51.64.80
Connecting to www.oecd-nea.org|193.51.64.80|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: 'dbforms/data/eva/evatapes/mendl_2/index.html'
Saving to: 'robots.txt'

Upvotes: 0

Views: 7590

Answers (1)

Marcus Müller

Reputation: 36337

Add the depth of links you want to follow (-l1, since you only want to follow one level of links):

wget -e robots=off -l1 -r -np -nH -R index.html http://www.oecd-nea.org/dbforms/data/eva/evatapes/mendl_2/

I also added -e robots=off, since there is a robots.txt on that server which would normally stop wget from recursing into that directory. As for the rest of the options:

  • -r: recursive download
  • -np: never ascend to the parent directory
  • -nH: don't create a directory named after the host
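As a sketch, the full command can be assembled one flag at a time with a comment on each, which makes the invocation self-documenting (swap the final echo for an actual wget run to perform the download):

```shell
#!/bin/sh
# Sketch: build the recursive-download command from the answer piece by
# piece, then print it. Replace `echo` with an actual invocation (e.g.
# run the printed line) to download the directory.
url="http://www.oecd-nea.org/dbforms/data/eva/evatapes/mendl_2/"

cmd="wget"
cmd="$cmd -e robots=off"   # ignore robots.txt, which blocks this directory
cmd="$cmd -l1"             # recursion depth: follow links one level deep
cmd="$cmd -r"              # recursive download
cmd="$cmd -np"             # never ascend to the parent directory
cmd="$cmd -nH"             # don't create a directory named after the host
cmd="$cmd -R index.html"   # reject the auto-generated directory listings

echo "$cmd $url"
```

Note that without -l1 wget's default recursion depth is 5, so -l1 here mainly keeps the crawl from wandering deeper than the one listing you care about.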

Upvotes: 3
