Luke Richards

Reputation: 45

Using wget and a list of partial URLs

I have found a website that hosts a few files I'm after, but there are too many to download them all individually. The filenames take a fairly standard and reproducible form, e.g. 1_a, 1_b, 1_c, etc.

Is there a way, on the Linux command line, to use wget to automate downloading them all? I can easily put the filenames in a text file, one entry per line, and point the command at it, but the file wouldn't contain the whole URL, just the bit that changes, so the command would need to look something like:

wget url.com/files/(bit from file).doc sourcefile.txt

and basically substitute each entry from the source file for the bit in the brackets.

Also, at one stage a large chunk of the files (a few hundred) is simply sequentially numbered, so could I use a for loop for that part? If so, what would the syntax look like on the command line?

Many thanks.

Upvotes: 1

Views: 1417

Answers (3)

Kevin

Reputation: 56059

You can use brace expansion:

wget x.com/files/1_{a..z}.doc

You can combine it if needed:

wget x.com/files/{1..10}_{a..z}.doc
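Brace expansion also covers the sequentially numbered chunk mentioned in the question; if you prefer the explicit for loop the question asks about, the equivalent looks like this (the range and base URL here are placeholders, not from the real site):

```shell
# Loop over a numeric range; the echo prints the command that would run.
# Drop the echo to actually download.
for i in $(seq 1 5); do
    echo wget "x.com/files/$i.doc"
done
```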

Upvotes: 3

Barmar

Reputation: 780787

# read one filename per line and plug it into the URL
while read -r fn
do
    wget "url.com/files/$fn.doc"
done < sourcefile.txt
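The same per-line substitution can also be done in a single wget invocation by rewriting the list into full URLs first and passing it with wget's -i (--input-file) option; a sketch with made-up sample filenames standing in for the real sourcefile.txt:

```shell
# Sample input standing in for the real sourcefile.txt (contents are assumptions):
printf '1_a\n1_b\n1_c\n' > sourcefile.txt
# Prepend the base URL and append .doc to every line:
sed 's|.*|url.com/files/&.doc|' sourcefile.txt > urls.txt
# A single call then fetches the whole list:
#   wget -i urls.txt
cat urls.txt
```

Doing it in one wget process reuses the HTTP connection across files, which matters when there are a few hundred of them.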

Upvotes: 0
