danramosd.com

Reputation: 15

Using the command line to save a large number of HTML files

I have a website I need to save a large number of pages from. The pages are numbered incrementally: index.php?id=1, index.php?id=2, and so on. Is there a shell script (I'm on a Mac) I could run to loop through all of these pages and save each one into a directory?

Upvotes: 1

Views: 160

Answers (2)

Tomasz Nurkiewicz

Reputation: 340708

In bash:

for i in {1..100}; do wget "http://www.example.com/index.php?id=${i}"; done
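
Note: macOS doesn't ship with wget by default. curl, which is preinstalled, can do the same in one line using its built-in URL globbing; the [1-100] range and the #1 output placeholder are standard curl syntax, and example.com is a stand-in for the real host:

curl -o "page_#1.html" "http://www.example.com/index.php?id=[1-100]"

The quotes matter in both commands: they stop the shell from treating ? and [ ] as glob characters before curl or wget ever sees the URL.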

Upvotes: 2

Dennis Williamson

Reputation: 359875

#!/bin/bash
# Fetch index.php?id=1 through index.php?id=100 and save each page
# under its own numbered filename in the target directory.
url='http://example.com/index.php?id='
dir='path/to/dir/'
filename=file
extension=ext

for i in {1..100}
do
    wget "$url$i" -O "$dir$filename$i.$extension"
done
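One caveat: wget's -O option won't create missing directories, so the target directory must exist before the loop runs. A quick usage sketch, assuming the script above is saved as fetch.sh (a hypothetical name):

mkdir -p path/to/dir   # ensure the output directory exists first
bash fetch.sh          # files land in path/to/dir/ as file1.ext ... file100.ext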

Upvotes: 1
