basic

Reputation: 23

Recursively downloading files from a webpage

http://examples.oreilly.com/9780735615366/

I want to have all of these files on my disk.

As you can see, there are many folders, each containing a different type of file.

You cannot download a folder directly, only the individual files.


Is there any way to automate the process?

I expect I will need regular expressions on the URLs to arrange the files in a folder-like hierarchy.

What should I use? A scripting language like Python?

Upvotes: 2

Views: 3133

Answers (4)

Toucan

Reputation: 19

wget (a GNU command-line tool) will do this for you. The documentation for the recursive retrieval options is here: http://www.gnu.org/software/wget/manual/html_node/Recursive-Retrieval-Options.html
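For example, a minimal sketch (assuming a reasonably recent wget; -np stops it from ascending to the parent directory, -nH drops the hostname from the saved paths, and -R skips the auto-generated directory-listing pages):

    # Recursively fetch everything below the starting URL and
    # rebuild the folder hierarchy on disk.
    wget -r -np -nH -R "index.html*" \
        http://examples.oreilly.com/9780735615366/

Note that wget recreates the directory structure for you, so no regular expressions are needed to arrange the files.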

Upvotes: 1

Basic

Reputation: 26766

A cheating answer is to use FTP:

ftp://examples.oreilly.com/pub/examples/9780735615366/

corresponds to the example you gave.
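wget also speaks FTP, so the same recursive approach works there; a minimal sketch:

    # FTP exposes directory listings directly, so recursive
    # retrieval needs no HTML parsing at all.
    wget -r ftp://examples.oreilly.com/pub/examples/9780735615366/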

Upvotes: 0

ruslik

Reputation: 14900

Try wget. It's a simple command-line utility that can do exactly this.

Upvotes: 0

user405725

Reputation:

Take a look at the wget tool. It can do exactly what you want.

Upvotes: 4
