code-8

Reputation: 58692

How can I download all the files from a remote directory to my local directory?

I want to download all the files in a specific directory of my site.

Let's say I have 3 files in my remote SFTP directory

www.site.com/files/phone/2017-09-19-20-39-15

My goal is to create a local folder on my desktop containing ONLY those downloaded files. No parent files or parent directories needed. I am trying to get a clean report.

I've tried

wget -m --no-parent -l1 -nH -P ~/Desktop/phone/ www.site.com/files/phone/2017-09-19-20-39-15 --reject=index.html* -e robots=off

I got

[screenshot of the resulting directory structure]

I want to get

[screenshot of the desired directory structure]

How do I tweak my wget command to get something like that?

Should I use anything other than wget?

Upvotes: 0

Views: 1754

Answers (1)

Technophobe01

Reputation: 8676

Ihue,

Taking a programmatic shell perspective, I would recommend you try the following command-line script. Note that I have also added a citation so you can see the original thread.

wget -r -P ~/Desktop/phone/ -A txt www.site.com/files/phone/2017-09-19-20-39-15 --reject=index.html* -e robots=off

-r enables recursive retrieval. See Recursive Download for more information.

-P sets the directory prefix where all files and directories are saved to.

-A sets a whitelist for retrieving only certain file types. Strings and patterns are accepted, and both can be used in a comma separated list. See Types of Files for more information.
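Since the goal stated in the question is a flat local folder with no parent directories, one variation worth trying combines --no-parent (-np), -nH, and --cut-dirs. This is a sketch, not a tested answer: it assumes the remote path shown in the question, where the target directory sits three levels deep (files/phone/2017-09-19-20-39-15), so --cut-dirs=3 strips those leading components locally.

```shell
# Sketch: fetch only the files in that one remote directory into
# ~/Desktop/phone/, without recreating the parent hierarchy locally.
#   -r -l1        recurse one level (the directory's own files)
#   -np           never ascend to parent directories
#   -nH           drop the host name (www.site.com) from local paths
#   --cut-dirs=3  drop the 3 leading path components (files/phone/<date>)
#   -R            skip the auto-generated index pages
# The URL is the one from the question; adjust --cut-dirs to match the
# depth of your own remote path.
wget -r -l1 -np -nH --cut-dirs=3 \
     -P ~/Desktop/phone/ \
     -R "index.html*" -e robots=off \
     http://www.site.com/files/phone/2017-09-19-20-39-15/
```

Note the trailing slash on the URL: without it, wget may treat the last path component as a file rather than a directory to recurse into.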

Ref: @don-joey https://askubuntu.com/questions/373047/i-used-wget-to-download-html-files-where-are-the-images-in-the-file-stored

Upvotes: 3
