Reputation: 45
I have found a website that hosts a few files I'm after, but there are too many to download them all individually. The filenames take a fairly standard and reproducible form, e.g. 1_a, 1_b, 1_c, etc.
Is there a way, on the Linux command line, to use wget to automate downloading them all? I can easily put the filenames in a one-entry-per-line text file and point the command at it, but each line wouldn't be the whole URL, just the bit that changes, so the command would need to look something like:
wget url.com/files/(bit from file).doc sourcefile.txt
and basically substitute each entry from the source file into the bit in the brackets.
Also, at one stage a large chunk (a few hundred) of the files is simply sequentially numbered, so could I use a for loop for that bit? If so, what would the syntax be on the command line?
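I'm imagining something like the following, though I'm guessing at the syntax, and the range and URL are just placeholders:
for i in $(seq 1 300)
do
    wget "url.com/files/$i.doc"
done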
Many thanks.
Upvotes: 1
Views: 1417
Reputation: 1808
Using bash to generate the arguments via brace expansion:
$ echo wget http://example.com/files_{1,2}_{a..d}
wget http://example.com/files_1_a http://example.com/files_1_b http://example.com/files_1_c http://example.com/files_1_d http://example.com/files_2_a http://example.com/files_2_b http://example.com/files_2_c http://example.com/files_2_d
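The echo is just a dry run so you can preview the expanded command; remove it to actually start the downloads.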
Upvotes: 0
Reputation: 56059
You can use brace expansion:
wget x.com/files/1_{a..z}.doc
You can combine ranges if needed:
wget x.com/files/{1..10}_{a..z}.doc
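For the sequentially numbered chunk, a plain numeric range works too, and if the names are zero-padded, bash 4+ pads the range to match (the pattern below is just a guess at your naming scheme):
wget x.com/files/{001..300}.doc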
Upvotes: 3
Reputation: 780787
# read one entry per line and fetch the corresponding file
while IFS= read -r fn
do
    wget "url.com/files/$fn.doc"
done < sourcefile.txt
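where sourcefile.txt holds just the part of the name that changes, one entry per line, e.g.:
1_a
1_b
1_c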
Upvotes: 0