Reputation: 43153
Pretty basic question, but I haven't been able to find an answer. Using Transmit I can "move" files from one S3 bucket on one AWS account to another S3 bucket on another AWS account, but what it actually does is download the files from the first and then upload them to the second.
Is there a way to move files directly from one S3 account to another without downloading them in between?
Upvotes: 92
Views: 90056
Reputation: 865
The answers given so far all require an account that has access to both the source and target S3 buckets. I recently found myself in a situation where this was not allowed (for various non-technical company reasons that we'll just assume were good).
The solution I ended up going with was to:
1. Mount the target bucket with s3fs somewhere (/mnt/target).
2. Run aws s3 sync s3://source_bucket/folder /mnt/target/folder ... (or mv or cp as needed).
This is the easiest way I've seen to copy between folders when it's not allowed to have a single IAM role with permission to both, and when it's prohibitive to use an intermediate location.
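For concreteness, a minimal sketch of that flow; the bucket names, mount point, and credentials file path below are placeholders I've chosen, not part of the original answer:

# mount the target bucket with s3fs using the target account's keys (hypothetical names and paths)
echo "TARGET_ACCESS_KEY:TARGET_SECRET_KEY" > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs
mkdir -p /mnt/target
s3fs target_bucket /mnt/target -o passwd_file=${HOME}/.passwd-s3fs

# sync from the source bucket (readable with the AWS CLI credentials currently configured) into the mount
aws s3 sync s3://source_bucket/folder /mnt/target/folder

Note that the object data still passes through the machine running the sync here; the point of this approach is only to avoid needing one set of credentials with access to both buckets.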
Upvotes: 0
Reputation: 587
Yes, you can transfer a whole S3 bucket from your root account to another AWS root account.
I tried the options given here, and also explored solutions from blogs, but they didn't work for me, so I started digging through the Properties and Permissions tabs of the S3 bucket.
In the end I found a solution that is very easy to achieve and does not need any IAM role or policy to be created. Just follow the steps below.
Prerequisites:
Steps:
aws s3 cp --recursive s3://source-bucket s3://destination-bucket --source-region source-region --region destination-region --acl bucket-owner-full-control
This command performs a copy; if you want to move the objects instead, use mv instead of cp in the above command.
Replace source-bucket with the name of the bucket you want to copy from, and destination-bucket with the name of the bucket you want to copy to. You can also specify the source and destination region names. You can run this from your own machine, or spin up an EC2 instance and transfer your S3 data from there.
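For example, the move variant of the same command (bucket and region names are placeholders, as above) would be:

aws s3 mv --recursive s3://source-bucket s3://destination-bucket --source-region source-region --region destination-region --acl bucket-owner-full-control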
Upvotes: 1
Reputation: 1945
One can do it by running the following (use sync instead of mv to keep the buckets in sync):
aws s3 mv s3://source-bucket s3://destination-bucket --recursive
For this to work across accounts:
1. Attach a bucket policy to the source bucket in the source account.
2. Attach an AWS Identity and Access Management (IAM) policy to a user or role in the destination account.
3. Use that IAM user or role in the destination account to perform the cross-account move.
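A minimal sketch of what those two policies might look like, applied with the AWS CLI. The account ID, bucket names, user name, and file names are placeholders invented for illustration, not values from the answer; since mv also deletes from the source, delete permission is included:

# run in the source account: let the destination account list, read, and delete in the source bucket
cat > source-bucket-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::222222222222:root"},
      "Action": ["s3:ListBucket", "s3:GetObject", "s3:DeleteObject"],
      "Resource": ["arn:aws:s3:::source-bucket", "arn:aws:s3:::source-bucket/*"]
    }
  ]
}
EOF
aws s3api put-bucket-policy --bucket source-bucket --policy file://source-bucket-policy.json

# run in the destination account: let the copying user read the source bucket and write the destination bucket
cat > copy-user-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject", "s3:DeleteObject"],
      "Resource": ["arn:aws:s3:::source-bucket", "arn:aws:s3:::source-bucket/*"]
    },
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:PutObject"],
      "Resource": ["arn:aws:s3:::destination-bucket", "arn:aws:s3:::destination-bucket/*"]
    }
  ]
}
EOF
aws iam put-user-policy --user-name copy-user --policy-name cross-account-copy --policy-document file://copy-user-policy.json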
Upvotes: 0
Reputation: 16166
Let's say there are two accounts, a source account and a destination account, and two buckets, source-bucket and destination-bucket. We want to move all files from source-bucket to destination-bucket. We can do that with the following steps:
1. Configure the AWS CLI with credentials that have access to both buckets:
aws configure
2. Check that the source bucket is listable:
aws s3 ls s3://source-bucket/
3. Copy everything to the destination bucket:
aws s3 cp s3://source-bucket s3://destination-bucket --recursive
4. Or, to move instead of copy:
aws s3 mv s3://source-bucket s3://destination-bucket --recursive
Alternatively, you can use the sync command:
aws s3 sync s3://source-bucket s3://destination-bucket
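If you want to preview what would be transferred before running the real copy or sync, the CLI's --dryrun flag (not mentioned in the original answer) can be added to any of the commands above:

aws s3 sync s3://source-bucket s3://destination-bucket --dryrun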
For a better explanation, follow the link.
Upvotes: 4
Reputation: 642
For newly created files (NOT existing objects), you can take advantage of newer functionality from AWS: Cross-Region Replication (under "Versioning" in the S3 bucket settings). You can create a replication policy that will replicate new objects to a bucket in a different account.
For existing objects, you will still need to copy your objects using another method - unless AWS introduces native functionality for this in the future.
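A rough sketch of how that replication setup might look from the CLI, assuming versioning is enabled on both buckets and a replication role already exists; the account ID, role name, and bucket names are placeholders:

# versioning must be enabled on both buckets for replication to work
aws s3api put-bucket-versioning --bucket source-bucket --versioning-configuration Status=Enabled
aws s3api put-bucket-versioning --bucket destination-bucket --versioning-configuration Status=Enabled

# replication rule pointing at the bucket in the other account (role ARN is hypothetical);
# the destination account must also allow this role to replicate into its bucket via a bucket policy
cat > replication.json <<'EOF'
{
  "Role": "arn:aws:iam::111111111111:role/s3-replication-role",
  "Rules": [
    {
      "Status": "Enabled",
      "Prefix": "",
      "Destination": {"Bucket": "arn:aws:s3:::destination-bucket"}
    }
  ]
}
EOF
aws s3api put-bucket-replication --bucket source-bucket --replication-configuration file://replication.json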
Upvotes: 1
Reputation: 12814
Use the AWS CLI (I used an Ubuntu 14 EC2 instance) and just run the following command:
aws s3 sync s3://bucket1 s3://bucket2
You will need to specify the account details for one bucket, and have public write access or public read access to the other.
This will sync the two buckets. You can run the same command again later to sync quickly. The best part is that it doesn't seem to require any bandwidth (i.e. the files do not pass through the local computer).
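If you have the relevant account's credentials configured locally, one way to tell the command which account to act as is a named CLI profile (the profile name below is made up for illustration):

# hypothetical profile holding the credentials that can read bucket1 and write bucket2
aws configure --profile bucket-copier
aws s3 sync s3://bucket1 s3://bucket2 --profile bucket-copier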
Upvotes: 60
Reputation: 341
On Mac OS X I used the Transmit app from Panic. I opened one window for each S3 account (using the API Keys and secrets). I could then drag from one bucket in one window to another bucket in the other window. No need to download files locally first.
Update: Andrew is correct; Transmit does download the files locally and then upload them.
Upvotes: 3
Reputation: 411
CrossFTP can copy S3 files straight from one bucket to another without downloading them. It is a GUI S3 client that works on Windows, Mac, and Linux.
Upvotes: 2
Reputation: 2074
Yes, there is a way. And it's pretty simple, though it's hard to find. 8)
For example, suppose your first account's username is acc1 and your second account's is acc2.
Open AWS Management Console as acc1. Get to the Amazon S3 bucket properties, and in the "Permissions" tab click "Add more permissions". Then add List and View Permissions for "Authenticated Users".
Next, in AWS IAM (it's accessible from among the console tabs) of acc2 create a user with full access to the S3 bucket (to be more secure, you can set up exact permissions, but I prefer to create a temporary user for the transfer and then delete it).
Then you can use s3cmd (using the credentials of the newly created user in acc2) to do something like:
s3cmd cp s3://acc1_bucket/folder/ s3://acc2_bucket/folder --recursive
All transfer will be done on Amazon's side.
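A brief sketch of that flow with s3cmd, assuming it is installed and using the keys of the temporary user created in acc2 (the bucket names follow the answer's own placeholders):

# enter the acc2 user's access key and secret key when prompted
s3cmd --configure

# server-side copy from acc1's bucket into acc2's bucket
s3cmd cp s3://acc1_bucket/folder/ s3://acc2_bucket/folder --recursive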
Upvotes: 125
Reputation: 63586
boto works well. See this thread. Using boto, you copy objects straight from one bucket to another, rather than downloading them to the local machine and uploading them to another bucket.
Upvotes: 6
Reputation: 18832
If you are just looking for a ready-made solution, there are a few out there that can do this. Bucket Explorer works on Mac and Windows and can copy across accounts, as can Cloudberry S3 Explorer and S3 Browser, but those two are Windows-only so may not work for you.
I suspect the AWS console could also do it with the appropriate permissions set up, but I haven't tested this.
You can also do it using the AWS API, as long as you have given the AWS account you are using write permissions to the destination bucket.
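At the API level this is S3's CopyObject (server-side copy) operation; a minimal per-object sketch via the CLI, with placeholder bucket names and key:

# requires read access to the source object and write access to the destination bucket
aws s3api copy-object --copy-source source-bucket/path/to/file.txt --bucket destination-bucket --key path/to/file.txt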
Upvotes: 22