Boolean

Reputation: 14660

Alternative to scp for transferring files between Linux machines by opening parallel connections

Is there an alternative to scp for transferring a large file from one machine to another by opening parallel connections, which can also pause and resume the download?

Please don't migrate this to serverfault.com. I am not a system administrator; I am a developer trying to transfer past database dumps between backup hosts and servers.

Thank you

Upvotes: 8

Views: 33571

Answers (4)

danw

Reputation: 1558

Similar to MikeK's answer, check out https://code.google.com/p/scp-tsunami/ - it handles splitting the file, starting several scp processes to copy the parts, and then joining them back together again. It can also copy to multiple hosts.

 ./scpTsunami.py -v -s -t 9 -b 10m -u dan bigfile.tar.gz /tmp -l remote.host

That splits the file into 10MB chunks and copies them using 9 scp processes...

Upvotes: 4

Anonymous

Reputation: 31

The program you are after is lftp. It supports sftp and parallel transfers using its pget command. It is available under Ubuntu (sudo apt-get install lftp) and you can read a review of it here:

http://www.cyberciti.biz/tips/linux-unix-download-accelerator.html
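
As a rough sketch (the user, host, and path below are placeholders for your own), a parallel, resumable download over sftp could look like:

 lftp -e 'pget -n 4 -c /backups/dump.sql.gz; quit' sftp://user@remote.host

Here -n 4 downloads the file over four connections and -c continues a previously interrupted transfer, which covers the pause/resume requirement.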

Upvotes: 3

MikeK

Reputation: 702

You could try using split(1) to break the file apart and then scp the pieces in parallel. The file could then be reassembled into a single file on the destination machine with cat(1).

# on the local host
split -b 1M large.file large.file.   # split into 1 MiB chunks (large.file.aa, large.file.ab, ...)
for f in large.file.*; do scp "$f" remote_host: & done
wait   # wait for all background scp jobs to finish

# on the remote host
cat large.file.* > large.file
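
If you go this route, it may be worth checking the reassembled file against a checksum taken before splitting, for example:

sha256sum large.file   # run on the local host before splitting
sha256sum large.file   # run on the remote host after reassembly; the two hashes should match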

Upvotes: 11

Dennis Williamson

Reputation: 360683

Take a look at rsync to see if it will meet your needs.
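
For example, a resumable pull of a dump might look like this (host and paths are placeholders):

 rsync -avz --partial --progress user@remote.host:/backups/dump.sql.gz /local/backups/

--partial keeps a partially transferred file so an interrupted copy can be resumed. Note that rsync uses a single connection, so it addresses the pause/resume part of the question rather than the parallel part.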

The correct placement of questions is not based on your role but on the type of question. Since this one is not strictly programming-related, it is likely to be migrated.

Upvotes: 4
