Reputation: 11
I am trying:
1. wget -i url.txt
and
2. wget -O output.ext
How do I combine the two? I want to download the URLs listed in url.txt and save each one under a name I specify, as separate files.
Upvotes: 0
Views: 1137
Reputation: 1719
Define all the URLs in url.txt and give this a try to see if this is what you need:
for url in $(cat url.txt); do wget "$url" -O "$url.out"; done
If your URLs contain slashes (path components), this variant replaces each slash with an underscore so the result is a valid filename:
for url in $(cat url.txt); do wget "$url" -O "$(echo "$url" | sed 's/\//_/g').out"; done
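A slightly more robust variant, as a sketch: assuming url.txt holds one URL per line, and assuming numbered names like download_1.out are acceptable (that naming scheme is made up here), a while/read loop avoids the word splitting that $(cat ...) is prone to:
#!/bin/bash
n=1
while IFS= read -r url; do            # read each line verbatim, no word splitting
    wget "$url" -O "download_$n.out"  # download_N.out is a hypothetical naming scheme
    n=$((n+1))
done < url.txt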
Upvotes: 1
Reputation: 1836
In this situation, I think, you need two files with the same number of lines, to map each URL to a corresponding name:
url.txt
(source file containing your URLs; example content given here):
https://svn.apache.org/repos/asf/click/trunk/examples/click-spring-cayenne/README.txt
https://svn.apache.org/repos/asf/click/trunk/examples/click-spring-cayenne/README.txt
output_names.txt
(filenames you want to assign):
readme1.txt
readme2.txt
Then you iterate over both files and pass the contents to wget, e.g. with the following script:
#!/bin/bash
# Read each input file into an array, one entry per line
IFS=$'\n' read -d '' -r -a url < "$1"
IFS=$'\n' read -d '' -r -a output < "$2"
len=${#url[@]}
# Fetch every URL and save it under the name at the same index
for ((i = 0; i < len; i++))
do
    wget "${url[$i]}" -O "${output[$i]}"
done
Call:
./script url.txt output_names.txt
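If you prefer not to read the files into arrays first, a paste-based variant does the same index-by-index mapping in one pipeline. This is a sketch, assuming neither file contains tab characters (paste joins corresponding lines with a tab by default):
paste url.txt output_names.txt | while IFS=$'\t' read -r url name; do
    wget "$url" -O "$name"   # each paste line is "URL<TAB>filename"
done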
Upvotes: 1