KT.

Reputation: 11430

Fastest way to download large files from AWS EC2 EBS

Suppose I have a couple of terabytes worth of data files that have accumulated on an EC2 instance's block storage.

What would be the most efficient way of downloading them to a local machine? scp? ftp? nfs? http? rsync? Going through an intermediate s3 bucket? Torrent via multiple machines? Any special tools or scripts out there for this particular problem?

Upvotes: 2

Views: 8171

Answers (3)

KT.

Reputation: 11430

As I did not really receive a convincing answer, I decided to make a small measurement myself. Here are the results I got:

[image: benchmark results of the measured transfer methods]

More details here.

Upvotes: 5

rock3t

Reputation: 2233

Please follow these steps:

  • Move the data as one file: tar everything into a single archive.
  • Create an S3 bucket in the same region as your EC2 instance/EBS volume.
  • Use the AWS CLI `aws s3` command to upload the archive to the bucket.
  • Use the AWS CLI to pull the archive down to your local machine or whatever other storage you use.

This will be the easiest and most efficient way for you.
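The steps above can be sketched as a small script. The bucket name, directory, and file names below are placeholders, not anything from your setup; the script also creates a stand-in data file so the tar step runs end-to-end, and it skips the S3 steps if the AWS CLI is not installed:

```shell
set -eu

DATA_DIR=data               # directory holding the accumulated files (placeholder)
ARCHIVE=backup.tar.gz
BUCKET=my-transfer-bucket   # hypothetical bucket created in the same region as the EC2/EBS

# Step 1: tar everything into a single archive file.
mkdir -p "$DATA_DIR" && echo sample > "$DATA_DIR/file1.txt"   # stand-in data
tar -czf "$ARCHIVE" "$DATA_DIR"

# Steps 2-4: upload the archive to S3, then pull it down on the local machine.
if command -v aws >/dev/null 2>&1; then
    aws s3 cp "$ARCHIVE" "s3://$BUCKET/$ARCHIVE"
    # ...then, on the local machine:
    aws s3 cp "s3://$BUCKET/$ARCHIVE" .
else
    echo "aws CLI not installed; skipping S3 upload/download"
fi
```

Moving one large archive instead of many small files matters because per-file overhead (metadata round trips, connection setup) dominates when the file count is high.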

Upvotes: 5

Janusz

Reputation: 1473

Some more info about this use case is needed. I hope the concepts below are helpful:

  • HTTP - fast, easy to implement, versatile, and has little overhead.
  • Resilio (formerly BitTorrent Sync) - fast, easy to deploy, decentralized, and secure. Can handle transfer interruptions. Works even if both endpoints are behind NAT.
  • rsync - an old-school and well-known solution. Can resume interrupted transfers and is fast at syncing large amounts of data.
  • Upload to S3 and get it from there - uploading to S3 is fast. You can then use HTTP(S) or BitTorrent to fetch the data locally.

Upvotes: 2

Related Questions