Copy files between S3 buckets when the destination bucket has a directory

I have two S3 buckets. I want to copy a file from bucket "from" to bucket "to". In bucket "to", I have a folder /copy_here. When I copy the file directly between the buckets using Boto, it works. But when I try to copy it into a directory in the destination bucket, I get a ParameterValidationError. Here is the code I tried to execute:

def copyToBucket(fromBucket, toBucket, fileName):
    # s3 is a boto3.resource('s3') defined elsewhere; readBucketName and
    # uploadBucketName are module-level bucket-name strings (the fromBucket
    # and toBucket parameters are currently unused)
    copySource = {
        'Bucket': readBucketName,
        'Key': fileName
    }
    uploadBucket = s3.Bucket(uploadBucketName)
    uploadBucket.copy(copySource, fileName)

I also looked at the S3 meta client, which offers the same functionality. I am not sure whether this operation is possible in a single step; if it is, that would be great. If not, should I first copy the file to the destination bucket and then move it into the copy_here folder? Currently, my uploadBucketName is set to to/copy_here.

Upvotes: 0

Views: 819

Answers (1)

John Rotenstein

Reputation: 270154

Folders and directories do not exist in Amazon S3. Instead, the filename (Key) of an object consists of the full path plus filename. (Well, the S3 management console makes it appear that there are folders, but they don't really exist.)
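To see that a "folder" is nothing more than a key prefix, here is a minimal local illustration (no AWS calls; the key names are made up):

```python
# Keys are plain strings; a "folder" is just a shared key prefix.
keys = ['copy_here/foo.txt', 'copy_here/bar.txt', 'foo.txt']

# "Listing a folder" is really just filtering keys by prefix:
in_folder = [k for k in keys if k.startswith('copy_here/')]
print(in_folder)  # ['copy_here/foo.txt', 'copy_here/bar.txt']
```

This is exactly what the S3 console does when it displays "folders": it groups object keys by their prefix up to the next slash.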

So, this would copy a file as you desire:

import boto3

s3 = boto3.resource('s3')

copySource = {
    'Bucket': 'source-bucket-name',
    'Key': 'foo.txt'
}
uploadBucket = s3.Bucket('destination-bucket-name')
uploadBucket.copy(copySource, 'copy_here/foo.txt')

Note that the destination Key consists of the path plus the filename, while the Bucket parameter must be the bucket name alone, with no path appended.
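Since the question mentions the meta client: the same copy can be done in a single call with s3.meta.client.copy(CopySource, Bucket, Key). A small hypothetical helper (the name dest_key is illustrative, not part of boto3) makes the key construction explicit; the actual AWS call is shown commented out, since it needs real buckets and credentials:

```python
def dest_key(folder, filename):
    """Build an S3 key from a 'folder' prefix and a filename.
    There is no real directory in S3; the '/' is just part of the key."""
    return f"{folder.strip('/')}/{filename}"

print(dest_key('copy_here', 'foo.txt'))  # copy_here/foo.txt

# Equivalent single-step copy via the client-level managed transfer:
# import boto3
# s3 = boto3.resource('s3')
# s3.meta.client.copy(
#     {'Bucket': 'source-bucket-name', 'Key': 'foo.txt'},  # CopySource
#     'destination-bucket-name',                            # bucket name only
#     dest_key('copy_here', 'foo.txt'),                     # path + filename
# )
```

Either way, no copy-then-move is needed: writing the object under the key copy_here/foo.txt is itself the one-step operation.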

Upvotes: 1