Reputation: 71
I have 4 S3 buckets containing almost 40 TB of data, roughly 90% of which is in the S3 Glacier Deep Archive storage class.
I need to move this data to another AWS account since I am planning to close my current account.
NOTE: The buckets in both accounts are in the ap-south-1 region.
Please suggest an easy, cost-effective, and fast way to do this, because when I tried it with AWS S3 CLI commands, copying this amount of data takes days.
Upvotes: 0
Views: 845
Reputation: 270274
Since you want to copy existing S3 objects, you can use S3 Batch Operations. See: Perform large-scale batch operations on Amazon S3 objects - Amazon Simple Storage Service.
Your steps would be:

1. Restore the objects out of Deep Archive so that they can be copied (see the restore sketch after this list).
2. Once the restores complete, run an S3 Batch Operations "Copy" job that copies the objects to the bucket in the other account (see the job-creation sketch further down).
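As a minimal sketch of step 1, assuming hypothetical bucket and key names, a restore can be initiated per object with boto3. The Bulk retrieval tier is the cheapest one available for Deep Archive, though restores can take up to about 48 hours; at the scale of millions of objects you would drive this from a manifest with a Batch Operations "Restore" job rather than a loop.

```python
import boto3

s3 = boto3.client("s3", region_name="ap-south-1")

# Initiate a restore from Deep Archive for one object.
# "Bulk" is the cheapest retrieval tier; restores can take up to ~48 hours.
s3.restore_object(
    Bucket="source-bucket",   # placeholder bucket name
    Key="path/to/object",     # placeholder key
    RestoreRequest={
        "Days": 7,            # keep the restored copy available for 7 days
        "GlacierJobParameters": {"Tier": "Bulk"},
    },
)
```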
The destination bucket will require a Bucket Policy that permits S3 Batch Operations to write to that bucket.
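A rough sketch of such a bucket policy, applied with credentials for the destination account. The account ID, role name, bucket name, and action list are placeholders that you would adjust to match your actual Batch Operations role:

```python
import json
import boto3

SOURCE_ACCOUNT_ID = "111111111111"   # placeholder: account running the Batch Operations job
BATCH_ROLE = f"arn:aws:iam::{SOURCE_ACCOUNT_ID}:role/batch-operations-role"  # placeholder role
DEST_BUCKET = "destination-bucket"   # placeholder bucket name

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowBatchOperationsCopy",
            "Effect": "Allow",
            # The IAM role that the Batch Operations job assumes in the source account
            "Principal": {"AWS": BATCH_ROLE},
            "Action": ["s3:PutObject", "s3:PutObjectAcl", "s3:PutObjectTagging"],
            "Resource": f"arn:aws:s3:::{DEST_BUCKET}/*",
        }
    ],
}

# Run this with credentials for the *destination* account.
s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket=DEST_BUCKET, Policy=json.dumps(policy))
```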
See: Cross-account bulk transfer of files using Amazon S3 Batch Operations | AWS Storage Blog
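For reference, creating the cross-account Copy job itself might look like the sketch below, using boto3's s3control client. Every identifier here (account ID, bucket ARNs, manifest location and ETag, role ARN) is a placeholder; the manifest is a CSV of bucket,key rows or an S3 Inventory report.

```python
import boto3

s3control = boto3.client("s3control", region_name="ap-south-1")

response = s3control.create_job(
    AccountId="111111111111",          # placeholder: source account ID
    ConfirmationRequired=True,          # job waits for confirmation before running
    Operation={
        "S3PutObjectCopy": {
            # Destination bucket in the other account
            "TargetResource": "arn:aws:s3:::destination-bucket",
        }
    },
    Manifest={
        "Spec": {
            "Format": "S3BatchOperations_CSV_20180820",
            "Fields": ["Bucket", "Key"],
        },
        "Location": {
            "ObjectArn": "arn:aws:s3:::source-bucket/manifest.csv",  # placeholder
            "ETag": "manifest-object-etag",                          # placeholder
        },
    },
    Report={
        "Bucket": "arn:aws:s3:::source-bucket",  # where the completion report is written
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "Prefix": "batch-op-reports",
        "ReportScope": "AllTasks",
    },
    Priority=10,
    RoleArn="arn:aws:iam::111111111111:role/batch-operations-role",  # placeholder
)
print("Created job:", response["JobId"])
```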
Data Transfer charges will not apply since the buckets are in the same Region. However, costs will be incurred for restoring the objects stored in the Glacier Deep Archive storage class. Charges will also apply for reading the source objects and writing the destination objects. These charges are based on the number of API requests, so the copy will be more expensive if you have lots of small objects.
Upvotes: 1