AdrianPop

Reputation: 108

Using wget to save a single page

I want to save a single URL as a browsable HTML page, e.g. the front page of digg.com or yahoo.com, but so far wget follows all the links and saves far more pages than necessary.

I want wget to save only Digg's homepage, without following any external links, exactly like Google's cache: http://webcache.googleusercontent.com/search?q=cache%3Adigg.com&oq=cache%3Adigg.com&aqs=chrome..69i57j69i58.4306j0j1&sourceid=chrome&ie=UTF-8

Upvotes: 0

Views: 990

Answers (1)

Karl Adler

Reputation: 16836

wget --adjust-extension --span-hosts --convert-links --backup-converted --page-requisites [url]

from: https://superuser.com/questions/55040/save-a-single-web-page-with-background-images-with-wget

To specify the link recursion depth, use --level=depth (it only takes effect together with --recursive).
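As a concrete sketch for the question's digg.com example (URL substituted for the `[url]` placeholder above), the command might look like this:

```shell
# Save only the homepage, plus the assets needed to render it offline.
# --page-requisites  fetch images, CSS, and JS the page needs
# --span-hosts       allow those assets to live on other hosts (e.g. CDNs)
# --convert-links    rewrite references so the saved copy browses locally
# --adjust-extension append .html so the file opens in a browser
# --backup-converted keep a .orig copy of each file before link conversion
wget --adjust-extension --span-hosts --convert-links \
     --backup-converted --page-requisites http://digg.com/
```

Without --recursive, wget stops after the page and its requisites, so no further links are followed.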

Upvotes: 1

Related Questions