Reputation: 349
I have a text file containing a bunch of webpages:
http://rest.kegg.jp/link/pathway/7603
http://rest.kegg.jp/link/pathway/5620
…
My aim is to download all info on these pages to a single text file.
The following works perfectly, but it gives me 3000+ text files. How could I simply merge all the output into one file during the loop?
while read i; do wget $i; done < urls.txt
Thanks a lot
Upvotes: 0
Views: 1142
Reputation: 26667
Use the -O file
option, which writes the downloaded document to the given file instead of deriving a filename from the URL. Note that each wget invocation truncates that file, so inside the loop you need to write to stdout and append:
while read i; do wget -O - "$i" >> outputFile; done < urls.txt
The outputFile
will then contain the contents of all the pages.
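Appending with >> also preserves the order of urls.txt, so the merged file lists the pages in the same order as the input list.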
You can also skip the while
loop entirely by passing the list of URLs with -i file; a single wget invocation concatenates every downloaded document into the -O target:
wget -O outputFile -i urls.txt
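With 3000+ downloads wget's progress messages get noisy, so adding -q (quiet) may be worth it; a minimal sketch, assuming your list is in urls.txt and outputFile is the merged file you want:
wget -q -O outputFile -i urls.txt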
Upvotes: 1