david_adler

Reputation: 10902

How to download a full web page (CSS, JS and images included) and all linked web pages

This command gets all the files that are necessary to properly display a given HTML page.

wget --page-requisites http://example.com/your/page.html

I want to loop through all the links on that page (i.e. the a href's) and apply the same command (or something similar; it doesn't have to be bash) to each of them.
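A minimal sketch of that loop in bash (assuming the hrefs are absolute URLs; the grep/sed extraction below is a rough assumption, not a robust HTML parser):

# Download the page, pull out every href value, and fetch each linked
# page together with its requisites. Relative links would need to be
# resolved against the base URL first.
page="http://example.com/your/page.html"

wget -q -O - "$page" \
  | grep -o 'href="[^"]*"' \
  | sed 's/^href="//;s/"$//' \
  | while read -r link; do
      wget --page-requisites "$link"
    done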

Upvotes: 3

Views: 2820

Answers (1)

Jakub Kotowski

Reputation: 7571

wget -r -l 2 --page-requisites http://example.com/your/page.html

See man wget

Recursive Retrieval Options

   -r
   --recursive
       Turn on recursive retrieving.  The default maximum depth is 5.

   -l depth
   --level=depth
       Specify recursion maximum depth level depth.
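To stay close to the original goal (the starting page plus the pages it directly links to, each with their requisites), a depth of 1 may already be enough; the --convert-links and --no-parent flags here are suggestions beyond the command above:

wget -r -l 1 --page-requisites --convert-links --no-parent http://example.com/your/page.html

--convert-links rewrites links in the downloaded files so they work for local viewing, and --no-parent keeps the crawl from ascending above the starting directory.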

Upvotes: 4
