cnotethegr8

Reputation: 7510

Download files with filename from list of urls using wget

I have a file which has a list of links.

http://example.com/00001
http://example.com/00002
http://example.com/00003
http://example.com/00004

When I manually visit one of these links in a browser, a file is downloaded automatically. The downloaded file has its own unique name, which is different from the URL path.

I'm aware of wget -i file-of-links.txt, but each downloaded file is named after the URL, not after the actual file name.

Is there a command which will allow me to download the links from a given file and save each downloaded file with its actual file name rather than the URL name?

Upvotes: 1

Views: 1447

Answers (1)

Sharad

Reputation: 2023

Yes. You can do this with the --trust-server-names and --content-disposition options to wget.

--trust-server-names

If this is set to on, on a redirect the last component of the redirection URL will be used as the local file name. By default the last component of the original URL is used.

--content-disposition

If this is set to on, experimental (not fully-functional) support for Content-Disposition headers is enabled. This can currently result in extra round-trips to the server for a HEAD request, and is known to suffer from a few bugs, which is why it is not currently enabled by default.

This option is useful for some file-downloading CGI programs that use Content-Disposition headers to describe what the name of a downloaded file should be.
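Combining both options with the list file from the question, something like this should save each file under the name the server provides:

wget --content-disposition --trust-server-names -i file-of-links.txt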

Upvotes: 1
