meme

Reputation: 12227

Site Performance and Download

I want to find an automated way to download a single website page (not the entire site, just one page) and all of the elements on that page, then sum the size of those files.

When I say files, I mean the total size of the HTML, CSS, images, local and remote JS files, and any CSS background images: basically the entire page weight for a given page.

I thought about using cURL, but I was not sure how to make it grab remote and local JS files, as well as images referenced in the CSS files.

Upvotes: 0

Views: 86

Answers (1)

Benjamin Bannier

Reputation: 58774

Try wget:

  • make it download all files required to render the page with the -p or --page-requisites option
  • download scripts and images local to the site, following links no further than 2 hops away (this should get local images and code), with -l 2 (short for --level=2)
  • and rewrite links in the downloaded files to point to your local copies instead of their original paths with -k (short for --convert-links):
    wget -p -l 2 -k http://full_url/to/page.html
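Since the question also asks for the total size of the downloaded files, here is a minimal sketch that combines the wget invocation above with du to sum the on-disk page weight. The directory name pageweight and the example.com URL are placeholders; the wget failure guard just lets the script continue if the fetch fails (for example, when run offline):

```shell
# Fetch the page plus all of its requisites into a dedicated directory,
# then total the bytes on disk. Substitute your own URL.
mkdir -p pageweight
wget -p -l 2 -k -P pageweight http://example.com/ || true  # tolerate fetch errors

# -s: single summary line for the directory; -k: report in kilobytes (POSIX).
du -sk pageweight
```

Note that du reports disk usage (rounded up to filesystem blocks), so the number is a close upper bound on the true transfer size rather than an exact byte count; GNU du supports -b for apparent byte sizes if you need that precision.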

Upvotes: 1
