Thejus A P

Reputation: 71

Copy Deep Glacier S3 Data From One AWS account to another account

I have 4 S3 buckets holding almost 40 TB of data, about 90% of which is in the S3 Glacier Deep Archive storage class.

I need to move this data to another AWS account, since I am planning to close my current account.

NOTE: All the buckets are in the ap-south-1 region.

Please suggest a way to do this that is easy, cost-friendly, and fast; when I tried with AWS S3 CLI commands, it took days to transfer this much data.

Upvotes: 0

Views: 845

Answers (1)

John Rotenstein

Reputation: 270274

Since you want to copy existing S3 objects, you can use S3 Batch Operations -- see Perform large-scale batch operations on Amazon S3 objects - Amazon Simple Storage Service.

Your steps would be:

  • Use Amazon S3 Inventory to produce a manifest file listing all existing objects (and edit it if you only wish to copy a subset of objects)
  • Use S3 Batch Operations to temporarily restore the Glacier objects (charges apply) by supplying the manifest file that lists the objects to restore
  • Use S3 Batch Operations to copy the objects to the S3 bucket in the other account
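The restore and copy steps above can be sketched with the AWS CLI. The account IDs, bucket names, role name, and manifest location below are placeholders, not values from the question; the `aws s3control create-job` invocation is shown as a comment because it needs your own role ARN and the ETag of the uploaded manifest:

```shell
#!/bin/sh
# Step 2 (sketch): bulk-restore the Deep Archive objects listed in the manifest.
# BULK is the cheapest (and slowest) retrieval tier for Deep Archive.
cat > restore-operation.json <<'EOF'
{
  "S3InitiateRestoreObject": {
    "ExpirationInDays": 7,
    "GlacierJobTier": "BULK"
  }
}
EOF

# Step 3 (sketch): copy the restored objects to the other account's bucket.
cat > copy-operation.json <<'EOF'
{
  "S3PutObjectCopy": {
    "TargetResource": "arn:aws:s3:::destination-bucket"
  }
}
EOF

# Each job is then submitted along these lines (placeholder account ID,
# role, manifest ARN and ETag -- substitute your own values):
#
#   aws s3control create-job \
#     --account-id 111111111111 \
#     --operation file://restore-operation.json \
#     --manifest '{"Spec":{"Format":"S3BatchOperations_CSV_20180820","Fields":["Bucket","Key"]},"Location":{"ObjectArn":"arn:aws:s3:::source-bucket/manifest.csv","ETag":"<manifest-etag>"}}' \
#     --report '{"Bucket":"arn:aws:s3:::source-bucket","Enabled":true,"Format":"Report_CSV_20180820","ReportScope":"FailedTasksOnly","Prefix":"batch-reports"}' \
#     --priority 10 \
#     --role-arn arn:aws:iam::111111111111:role/BatchOperationsRole
echo "wrote restore-operation.json and copy-operation.json"
```

Run the restore job first, wait for the objects to become retrievable (Bulk restores from Deep Archive typically take up to 48 hours), then submit the copy job.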

The destination bucket will require a Bucket Policy that permits S3 Batch Operations to write to that bucket.
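A minimal sketch of such a bucket policy, assuming a hypothetical source account `111111111111` and a Batch Operations role named `BatchOperationsRole` (substitute your own identifiers):

```shell
#!/bin/sh
# Destination bucket policy (sketch): lets the source account's Batch
# Operations role write the copied objects. Account ID, role name, and
# bucket name are placeholders.
cat > dest-bucket-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBatchOperationsCopy",
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::111111111111:role/BatchOperationsRole"},
      "Action": ["s3:PutObject", "s3:PutObjectAcl"],
      "Resource": "arn:aws:s3:::destination-bucket/*"
    }
  ]
}
EOF
# Applied from the destination account with:
#   aws s3api put-bucket-policy --bucket destination-bucket \
#     --policy file://dest-bucket-policy.json
echo "wrote dest-bucket-policy.json"
```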

See: Cross-account bulk transfer of files using Amazon S3 Batch Operations | AWS Storage Blog

Data Transfer charges will not apply since the buckets are in the same Region. However, costs will be incurred for restoring the objects stored in the Glacier Storage Class. Charges will also apply for reading the source objects and writing the destination objects. These charges will be based on the number of API requests, so it will be more expensive if you have lots of small objects.
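A quick way to see how the object count drives the request charges is a back-of-envelope calculation. The per-1,000-request prices below are placeholders, not current AWS pricing; look up the actual ap-south-1 rates on the AWS pricing page before relying on the numbers:

```shell
#!/bin/sh
# Back-of-envelope request-cost sketch. Prices are PLACEHOLDERS only.
OBJECTS=1000000          # assumed number of objects in the manifest
RESTORE_PER_1000=0.05    # placeholder bulk-restore request price per 1,000 requests
COPY_PER_1000=0.01       # placeholder PUT/COPY request price per 1,000 requests

awk -v n="$OBJECTS" -v r="$RESTORE_PER_1000" -v c="$COPY_PER_1000" \
  'BEGIN { printf "restore requests: $%.2f, copy requests: $%.2f\n",
           (n / 1000) * r, (n / 1000) * c }'
```

The point of the sketch: the cost scales with the number of objects, not their size, which is why many small objects cost more to move than a few large ones.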

Upvotes: 1
