Boto3 not assuming IAM role from credentials where aws-cli does without problem

I am setting up some file transfer scripts and am using boto3 to do this.

I need to send some files from my local machine to a third-party AWS account (cross-account). I have a role set up on the other account with permissions to write to the bucket, and have given a user on my account permission to assume that role.

I am able to do this without any problem via the CLI, but boto3 keeps kicking out an AccessDenied error for the bucket.

I have read through the boto3 docs on this area, such as they are, and have set up the credentials and config files as they are supposed to be (I assume they are correct, as the CLI approach works), but I am unable to get this working.

Credential File:-

[myuser]
aws_access_key_id = XXXXXXXXXXXXXXXXXXXXXX
aws_secret_access_key = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX

Config File:-

[profile crossaccount]
region = eu-west-2
source_profile=myuser
role_arn = arn:aws:iam::0123456789:role/crossaccountrole
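One quick sanity check (assuming the AWS CLI is installed, as it is here) is to ask STS which identity the profile resolves to, independent of boto3:

```shell
# If the assume-role profile is wired up correctly, this prints the
# temporary role identity (an ".../assumed-role/crossaccountrole/..." ARN),
# not the underlying IAM user from [myuser].
aws sts get-caller-identity --profile crossaccount
```

If this prints the role ARN while the boto3 session still gets AccessDenied, the config files are fine and the problem is in how the Python code resolves credentials.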

and here is the code I am trying to get working with this:-

    import boto3

    #set up variables
    bucket_name = 'otheraccountbucket'
    file_name = 'C:\\Users\\test\\testfile.csv'
    object_name = 'testfile.csv'

    #create a boto3 session with the profile name so the assume-role call is made with the correct credentials
    session = boto3.Session(profile_name='crossaccount')
    #create an s3 client from that profile-based session
    s3_client = session.client('s3')

    #try to upload the file
    response = s3_client.upload_file(
                    file_name, bucket_name, object_name,
                    ExtraArgs={'ACL': 'bucket-owner-full-control'}
                )

EDIT: in response to John's comment about multi-part upload permissions, I have tried uploading via the put_object method to bypass this - but I still get AccessDenied, now on the PutObject permission, which I have confirmed is in place:-

    import boto3

    #set up variables
    bucket_name = 'otheraccountbucket'
    file_name = 'C:\\Users\\test\\testfile.csv'
    object_name = 'testfile.csv'

    #create a boto3 session with the profile name so the assume-role call is made with the correct credentials
    session = boto3.Session(profile_name='crossaccount')
    #create an s3 client from that profile-based session
    s3_client = session.client('s3')

    #try to upload the file
    with open(file_name, 'rb') as fd:
        response = s3_client.put_object(
            ACL='bucket-owner-full-control',
            Body=fd,
            Bucket=bucket_name,
            ContentType='text/csv',
            Key=object_name
        )

Crossaccountrole has the PutObject permission - the error is:-

An error occurred (AccessDenied) when calling the PutObject operation: Access Denied

END EDIT

Here is the working aws-cli command:-

aws s3 cp "C:\Users\test\testfile.csv" s3://otheraccountbucket --profile crossaccount

I am expecting this to upload correctly, as the equivalent CLI command does, but instead I get an S3UploadFailedError exception: An error occurred (AccessDenied) when calling the CreateMultipartUpload operation: Access Denied
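One way to take the profile machinery out of the picture entirely is to assume the role explicitly with STS and build the S3 client from the temporary credentials it returns. This is a sketch, not the asker's code; the helper name and the `upload-session` session name are my own, and the role ARN is the one from the question's config:

```python
import boto3

def cross_account_s3_client(role_arn, session_name="upload-session"):
    """Assume the cross-account role explicitly and return an S3 client
    built from the temporary credentials STS hands back."""
    # This STS client uses the default (or [myuser]) credentials
    sts = boto3.client("sts")
    creds = sts.assume_role(
        RoleArn=role_arn,
        RoleSessionName=session_name,
    )["Credentials"]
    return boto3.client(
        "s3",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )

# Usage (requires live AWS credentials):
# s3_client = cross_account_s3_client(
#     "arn:aws:iam::0123456789:role/crossaccountrole")
```

If an upload through a client built this way succeeds while the profile-based session fails, the bucket policy and role permissions are fine and the issue is in how the `crossaccount` profile is being resolved by boto3.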

Any help would be much appreciated.

Upvotes: 1

Views: 3639

Answers (1)

kyldu

Reputation: 187

I had this same problem; my issue ended up being that the AWS CLI was configured with different credentials than the Python app where I was trying to use boto3 to upload files into an S3 bucket.

Here's what worked for me; this only applies if you have the AWS CLI installed:

  1. Open your command line or terminal
  2. Type aws configure
  3. When prompted, enter the access key ID and secret access key of the IAM user your Python boto3 app is using
  4. Run your Python app and test boto3; you should no longer get the access denied message

Upvotes: 1
