user3188922

Reputation: 349

Open multiple webpages, use wget and merge the output

I have a text file containing a bunch of webpages:

http://rest.kegg.jp/link/pathway/7603
http://rest.kegg.jp/link/pathway/5620
…

My aim is to download all info on these pages to a single text file.

The following works perfectly, but it leaves me with 3000+ text files. How could I simply merge all of the output into one file during the loop?

while read i; do wget $i; done < urls.txt

Thanks a lot

Upvotes: 0

Views: 1142

Answers (1)

nu11p01n73R

Reputation: 26667

Use the -O file option, which writes the downloaded documents to the given file instead of creating a separate file per URL. Inside the loop, however, each wget invocation truncates that file, so write to stdout with -O - and let the shell append:

while read i; do wget -O - "$i" >> outputFile; done < urls.txt

outputFile will then contain the contents of every page, one after another.

You can also skip the while loop entirely by giving wget the URL list with -i file; within a single invocation, -O concatenates every downloaded document into that one file:

wget -O outputFile -i urls.txt
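
If you prefer explicit shell redirection, a minimal equivalent sketch (assuming the same urls.txt and outputFile names as above) sends every page to stdout with -O - and lets the shell collect them:

wget -O - -i urls.txt > outputFile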

Upvotes: 1
