mousesports

Reputation: 499

Command to Download Remote URL Folder Contents

I was wondering if there was a command to download the contents of a remote folder, i.e. all the files contained within that specific folder.

For instance, if we take the URL http://plugins.svn.wordpress.org/hello-dolly/trunk/, how would it be possible to download the two files contained within the trunk onto my local machine without having to download each file manually?

Also, if there is a way to download all contents, including both files AND any listed subdirectories, that would be great.

Upvotes: 2

Views: 3586

Answers (1)

kev

Reputation: 161814

If you ever need to download an entire Web site, perhaps for off-line viewing, wget can do the job.
For example:

$ wget \
     --recursive \
     --no-clobber \
     --page-requisites \
     --html-extension \
     --convert-links \
     --restrict-file-names=windows \
     --domains wordpress.org \
     --no-parent \
         http://plugins.svn.wordpress.org/hello-dolly/trunk/

This command recursively downloads the contents of http://plugins.svn.wordpress.org/hello-dolly/trunk/.

The options are:

--recursive: download the entire Web site.
--domains wordpress.org: don't follow links outside wordpress.org.
--no-parent: don't follow links outside the directory hello-dolly/trunk/.
--page-requisites: get all the elements that compose the page (images, CSS and so on).
--html-extension: save files with the .html extension.
--convert-links: convert links so that they work locally, off-line.
--restrict-file-names=windows: modify filenames so that they will work in Windows as well.
--no-clobber: don't overwrite any existing files (used in case the download is interrupted and resumed).
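
If the goal is just the files from a plain directory listing like the trunk URL above, rather than an offline mirror, a leaner invocation usually suffices. The following is a sketch: the --cut-dirs=2 value assumes the two path components (hello-dolly/trunk) in the example URL and would need adjusting for other paths:

$ wget \
     --recursive \
     --no-parent \
     --no-host-directories \
     --cut-dirs=2 \
     --reject "index.html*" \
         http://plugins.svn.wordpress.org/hello-dolly/trunk/

Here --reject "index.html*" discards the auto-generated directory listing pages, while --no-host-directories and --cut-dirs=2 flatten the saved paths so the files land directly in the current directory. And since the example URL is a Subversion repository, svn export http://plugins.svn.wordpress.org/hello-dolly/trunk/ would also fetch the folder, subdirectories included, in a single step if an svn client is available.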

Upvotes: 1
