Reputation: 1
I need to download and archive about 50 subsites (including all working links within each subsite) that were created as part of my company's main portal. I need wget to download the subsites without downloading the entire site.
From the bit of searching I've done, this is what I've tried so far:
wget --mirror --page-requisites --convert-links --recursive --adjust-extension --compression=auto --reject-regex "/search|/rss" --no-if-modified-since --no-check-certificate --user=xxxxxxx --password=xxxxxxx
This instead downloaded only the home page of each subsite, and none of the links within those pages worked.
Upvotes: 0
Views: 740
Reputation: 921
You should add --no-parent to restrict the download to the part of the site you want.
An example line would be wget --mirror --convert-links --page-requisites --no-parent -P /path/to/download https://example-domain.com.
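Since you have around 50 subsites, a minimal sketch of how you might loop over them (assuming you collect the subsite URLs, one per line, in a file; the file name subsites.txt, the download path, and the credentials are placeholders):

#!/bin/bash
# Mirror each subsite listed in subsites.txt.
# --no-parent keeps wget from climbing above each subsite's directory
# into the rest of the portal.
while IFS= read -r url; do
    wget --mirror \
         --page-requisites \
         --convert-links \
         --adjust-extension \
         --no-parent \
         --no-check-certificate \
         --user=xxxxxxx --password=xxxxxxx \
         -P /path/to/download \
         "$url"
done < subsites.txt

Note that --no-parent only works as intended when the URL you pass points at the subsite's own directory (e.g. https://example-domain.com/subsite1/), not at the portal's root.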
Upvotes: 2