Reputation: 395
I would like to copy multiple files simultaneously to speed up my process. I currently use the following:
scp -r [email protected]:/var/www/example/example.example.com .
but it only copies one file at a time. I have a 100 Mbps fibre connection, so I have the bandwidth available to copy many files at the same time. Please help.
Upvotes: 8
Views: 65484
Reputation: 2209
You can use background tasks with the wait command. wait ensures that all background tasks are completed before the next line is processed, i.e. the echo will be executed only after the scp to all three nodes has completed.
#!/bin/bash
scp -i anuruddha.pem myfile1.tar [email protected]:/tmp &
scp -i anuruddha.pem myfile2.tar [email protected]:/tmp &
scp -i anuruddha.pem myfile.tar [email protected]:/tmp &
wait
echo "SCP completed"
Upvotes: 9
Reputation: 91
You can use parallel-scp (AKA pscp): http://manpages.ubuntu.com/manpages/natty/man1/parallel-scp.1.html
With this tool, you can copy a file (or files) to multiple hosts simultaneously.
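For example, assuming the pssh package is installed (it provides parallel-scp on Debian/Ubuntu), and a hypothetical hosts.txt listing the target machines, an invocation might look like this (host names are placeholders):

```shell
# Hypothetical host list: one "user@host" per line.
cat > hosts.txt <<'EOF'
root@host1.example.com
root@host2.example.com
EOF

# Copy the same file to every listed host in parallel
# (skipped gracefully here if parallel-scp is not installed).
if command -v parallel-scp >/dev/null 2>&1; then
    parallel-scp -h hosts.txt myfile.tar /tmp/
else
    echo "parallel-scp not installed"
fi
```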
Upvotes: 3
Reputation: 628
I am not sure if this helps you, but I generally archive the files at the source (compression is not required; just archiving is sufficient), download the archive, and extract it. This speeds up the process significantly. Before archiving, it took more than 8 hours to download 1 GB; after archiving, it took less than 8 minutes to do the same.
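A minimal local sketch of the idea (all paths and file names here are made up; in practice the archiving step would run on the remote machine, and the archive would be downloaded with scp):

```shell
# Stand-in for a directory of many small files at the source.
mkdir -p example
for i in 1 2 3; do echo "data $i" > "example/file$i.txt"; done

# Archive at the "source": one big file transfers much faster
# than thousands of small ones.
tar cf example.tar example/

# Extract at the "destination".
mkdir -p dest
tar xf example.tar -C dest/
```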
Upvotes: 4
Reputation: 2111
SSH can do so-called "multiplexing": several sessions over one connection (to one server). It can be one way to achieve what you want. Look up keywords like "ControlMaster".
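Multiplexing is configured in ~/.ssh/config; a typical stanza (option names are from OpenSSH, the values are just examples) looks like:

```
Host *
    ControlMaster auto
    ControlPath ~/.ssh/cm-%r@%h:%p
    ControlPersist 10m
```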
A second way is to use more connections, sending every job to the background:
for file in file1 file2 file3 ; do
scp "$file" server:/tmp/ &
done
But this is the answer to your question, "how to copy multiple files simultaneously". For a speedup, you can use a weaker cipher (rc4, etc.), and don't forget that the bottleneck can be your hard drive, since scp doesn't implicitly limit the transfer speed.
The last option is rsync; in some cases it can be a lot faster than scp...
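The background-job pattern above can be exercised locally, using cp as a stand-in for scp (the file names here are made up):

```shell
# Set up a few dummy files and a local "destination".
mkdir -p tmpdest
echo a > file1; echo b > file2; echo c > file3

# Launch one copy per file in the background...
for file in file1 file2 file3; do
    cp "$file" tmpdest/ &
done

# ...and block until all of them have finished.
wait
echo "all copies finished"
```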
Upvotes: 6
Reputation: 111
If you specify multiple files, scp will download them sequentially:
scp -r [email protected]:/var/www/example/file1 [email protected]:/var/www/example/file2 .
Alternatively, if you want the files to be downloaded in parallel, then use multiple invocations of scp, putting each in the background.
#! /usr/bin/env bash
scp [email protected]:/var/www/example/file1 . &
scp [email protected]:/var/www/example/file2 . &
Upvotes: 0
Reputation: 328594
100 Mbit Ethernet is pretty slow, actually. In theory you can expect about 12 MiB/s; in practice, you usually get between 4 and 6 MiB/s at best.
That said, you won't see a speed increase if you run multiple sessions in parallel. You can try it yourself: simply run two parallel SCP sessions copying two large files. My guess is that you won't see a noticeable speedup.
Solutions:
- Use rsync (which uses SSH under the hood) to copy only the files which have changed since the last time you ran the command.
- Pipe a tar archive through SSH instead of copying the files one by one.
The last suggestion can be done like this:
ssh root@xxx "cd /var/www/example ; tar cf - example.example.com" > example.com.tar
or with compression:
ssh root@xxx "cd /var/www/example ; tar czf - example.example.com" > example.com.tar.gz
Note: bzip2 compresses better but is slower. That's why I use gzip (z) for tasks like this.
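Locally, the SSH hop can be replaced with a plain pipe to see the streaming-tar technique at work (the directory contents here are made up):

```shell
# Stand-in for the remote document root.
mkdir -p src/example.example.com
echo "index" > src/example.example.com/index.html

# Stream the archive through a pipe instead of through ssh,
# exactly as `ssh host "tar cf - dir" | tar xf -` would.
mkdir -p dest
(cd src && tar cf - example.example.com) | tar xf - -C dest
```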
Upvotes: 4