Reputation: 39
I am trying to transfer files from one folder to another in the same S3 bucket, but I am getting an error. I have written the code below to transfer files from one folder to another.
import boto
from boto.s3.connection import S3Connection
import boto.s3
import sys
from boto.s3.key import Key
conn = S3Connection('access key', 'secret key')
bucket = conn.get_bucket('bucket-name')
for file in bucket.list("2/", "/"):
    k = Key(bucket)
    print(k)
    k.key = '3'
    k.set_contents_from_filename(file)
And I am getting the following error:
TypeError Traceback (most recent call last)
<ipython-input-34-4f81a952b1f1> in <module>()
15 print(k)
16 k.key = '3'
---> 17 k.set_contents_from_filename(file)
18
19
/Users/anaconda/lib/python3.6/site-packages/boto/s3/key.py in set_contents_from_filename(self, filename, headers, replace, cb, num_cb, policy, md5, reduced_redundancy, encrypt_key)
1356 :return: The number of bytes written to the key.
1357 """
-> 1358 with open(filename, 'rb') as fp:
1359 return self.set_contents_from_file(fp, headers, replace, cb,
1360 num_cb, policy, md5,
TypeError: expected str, bytes or os.PathLike object, not Key
Is there an efficient way to transfer files from one folder to another in the same S3 bucket using Python boto/boto3? I am testing this on a small number of files, but I actually have 60 GB of data that I have to transfer in batches of 1000 files.
Could anyone help me with this?
Thanks
Upvotes: 1
Views: 7692
Reputation: 21
I recommend using boto3 for this purpose. You can easily copy S3 objects from one folder to another.
import boto3
s3 = boto3.resource('s3')
copy_source = {
    'Bucket': 'mybucket',
    'Key': 'mykey'
}
s3.meta.client.copy(copy_source, 'otherbucket', 'otherkey')
You can use the above code in a loop, replacing the key with each of your keys. After copying the objects, you can use the same keys to delete them from the source folder.
Upvotes: 1