Sigfredo Soto-Diaz

Reputation: 11

Moving files to and from an Amazon S3 bucket key using Python

I do not have access to the root bucket but I do have access to a key (KEY NAME) within the bucket.

Example: I cannot access 'BUCKET NAME' but I can access 'BUCKET NAME/KEY NAME'

I have been trying to move files within 'KEY NAME'. In the code below, the only call I've managed to get working is list_objects_v2.

upload_file gives me the following error:

An error occurred (AccessDenied) when calling the PutObject operation: Access Denied

download_file gives me the following error:

PermissionError: [WinError 5] Access is denied: 'C/Users/username/Desktop'

I'm very new to the AWS environment. What can I do on my end to fully get the access I need?

import logging
import sys

import boto3


def main():

    arguments = len(sys.argv) - 1

    if arguments < 1:
        print("You must supply a folder name")
        return

    bucket_name = 'BUCKET NAME'
    key_name = 'KEY NAME'
    folder = sys.argv[1]


    s3 = boto3.client('s3')
    objects = s3.list_objects_v2(Bucket=bucket_name,
                                 Prefix=key_name + '/' + folder + '/',
                                 Delimiter='/')
    i = 1

    #
    # Print the bucket's objects within 'KEY NAME'
    #
    if objects is not None:
        # List the object names
        logging.info(f'Objects in {bucket_name}')
        print("Length of Objects: " + str(len(objects)))
        for obj in objects:
            print("......\n")
            print(i)
            print("....\n")
            print(obj)
            print("..\n")
            print(objects[obj])
            i += 1
    else:
        # Didn't get any keys
        logging.info(f'No objects in {bucket_name}')

    #
    # Test to see if we can isolate a folder within 'KEY NAME'
    #
    print("\n")
    print("Common Prefixes: " + str(objects['CommonPrefixes']) + "\n")
    keys = objects['CommonPrefixes']
    print("Object 0: " + str(keys[0]) + '\n')

    s3 = boto3.resource('s3')
    s3.meta.client.upload_file('C:/Users/username/Desktop/Test/Test.txt',
                               bucket_name,
                               key_name)
    # s3.meta.client.download_file(bucket_name,
    #                              key_name + '/' + folder + '/' + 'Test.txt',
    #                              'C:/Users/username/Desktop')

if __name__ == '__main__':
    main()

Upvotes: 0

Views: 537

Answers (1)

John Rotenstein

Reputation: 269530

The most important part is to ensure that the IAM user or role your code runs as has been granted adequate permissions to upload, download, and list objects under the prefix.
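Before changing any policy, it can help to confirm which IAM identity boto3 is actually using, via the STS get_caller_identity call. A small sketch (the principal_name helper is just an illustration, not part of boto3):

```python
def principal_name(arn: str) -> str:
    # The final path segment of an IAM user/role ARN is the principal name.
    return arn.split('/')[-1]

# To see which identity boto3 is using (requires valid AWS credentials):
#   import boto3
#   arn = boto3.client('sts').get_caller_identity()['Arn']
#   print(principal_name(arn))  # the policy must be attached to this principal
```

If the printed identity is not the one the policy is attached to, no amount of policy editing will fix the AccessDenied error.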

Here is an example policy that grants access to a prefix of special/:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowUserToSeeBucketListInTheConsole",
            "Action": [
                "s3:ListAllMyBuckets",
                "s3:GetBucketLocation"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::*"
            ]
        },
        {
            "Sid": "AllowListingOfPrefix",
            "Action": [
                "s3:ListBucket"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::my-bucket"
            ],
            "Condition": {
                "StringEquals": {
                    "s3:prefix": [
                        "special/"
                    ],
                    "s3:delimiter": [
                        "/"
                    ]
                }
            }
        },
        {
            "Sid": "UploadDownload",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::my-bucket/special/*"
        }
    ]
}

Then, you can run code like this:

import boto3

s3_client = boto3.client('s3')

# Upload a file to S3
s3_client.upload_file('/tmp/hello.txt', 'my-bucket', 'special/hello.txt')

# Download an object
s3_client.download_file('my-bucket', 'special/hello.txt', '/tmp/hello2.txt')

# List objects using the client method
response = s3_client.list_objects_v2(Bucket='my-bucket', Delimiter='/', Prefix='special/')
for obj in response.get('Contents', []):
    print(obj['Key'], obj['Size'])

# List objects using the resource method
s3_resource = boto3.resource('s3')
bucket = s3_resource.Bucket('my-bucket')

for obj in bucket.objects.filter(Delimiter='/', Prefix='special/'):
    print(obj.key, obj.size)
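One more note on the download error in the question: download_file needs a full destination file path, not a directory. Passing 'C:/Users/username/Desktop' (a folder) is the likely cause of PermissionError: [WinError 5], since Windows refuses to open a directory for writing. A minimal sketch, where the local_dest helper name is just an illustration:

```python
import os

def local_dest(directory: str, key: str) -> str:
    # download_file writes to a *file* path; build one from the key's base name.
    return os.path.join(directory, os.path.basename(key))

# s3_client.download_file('my-bucket', 'special/hello.txt',
#                         local_dest('C:/Users/username/Desktop', 'special/hello.txt'))
```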

Upvotes: 1
