PurifiedDrinkingWater

Reputation: 21

Wget macro for downloading multiple URLs?

(NOTE: You need at least 10 reputation to post more than 2 links. I had to remove the http from the URLs, but I hope it's still understandable!)

Hello!

I am trying to use wget to download an entire website for personal educational use. Here's what the URLs look like:

example.com/download.php?id=1

I want to download all the pages, from id=1 up to the last page, which is id=4952.

so the first URL is:

example.com/download.php?id=1

and the last is

example.com/download.php?id=4952

What would be the most efficient way to download all the pages from 1 to 4952?

My current command (which works perfectly, exactly the way I want it to, for a single page) is:

wget -P /home/user/wget -S -nd --reject=.rar http://example.com/download.php?id=1

NOTE: The website has a trap; if you try to run the following recursive command:

 wget -P /home/user/wget -S -nd --reject=.rar --recursive --no-clobber --domains=example.com --no-parent http://example.com/download.php

it will download a 1000 GB .rar file just to troll you!

I'm new to Linux, please be nice! Just trying to learn!

Thank you!

Upvotes: 1

Views: 277

Answers (1)

PurifiedDrinkingWater

Reputation: 21

Notepad++:

your URL + the Column Editor = a massive list of all the URLs

wget -i your_file_with_all_urls = Success!
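
If you'd rather build the list on the command line instead of in Notepad++, here is a minimal sketch using seq and sed (the filename urls.txt is just an example):

 # generate every URL from id=1 through id=4952, one per line
 seq 1 4952 | sed 's|^|http://example.com/download.php?id=|' > urls.txt

 # download them all with the same options as the single-page command
 wget -P /home/user/wget -S -nd --reject=.rar -i urls.txt

In bash you could also skip the file entirely with brace expansion, e.g. wget [options] "http://example.com/download.php?id="{1..4952}, though a URL list is easier to inspect before committing to 4952 downloads.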

Thanks to Barmar.

Upvotes: 1
