Reputation: 25604
I am using the Python library smart_open
to upload files (they can be large) from a Python script to an S3 bucket.
The bucket has a policy enforcing SSE with KMS:
{
    "Version": "2012-10-17",
    "Id": "PutObjPolicy",
    "Statement": [
        {
            "Sid": "RequireKMSEncryption",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::n-test-kms-123456789/*",
            "Condition": {
                "StringNotEquals": {
                    "s3:x-amz-server-side-encryption": "aws:kms"
                }
            }
        }
    ]
}
I try to open the file for writing with:

import boto3
from smart_open import open

with open(
    's3://' + BUCKET_NAME + '/robots.txt',
    'w',
    transport_params={
        'multipart_upload_kwargs': {
            'ServerSideEncryption': 'aws:kms',
            'SSEKMSKeyId': 'arn:aws:kms:us-east-2:1234567890:key/86fb3bf7-e9ef-4a93-bc64-35dcf1ca3c8d'
        },
        'client': boto3.client('s3')
    }
) as json_file:
    json_file.write('test')
I keep getting this error:

ValueError: the bucket 'n-test-kms-123456789' does not exist, or is forbidden for access (ClientError('An error occurred (AccessDenied) when calling the CreateMultipartUpload operation: Access Denied'))

The user and its IAM role have full permissions on that S3 bucket (including CreateMultipartUpload), so the whole problem seems to come down to properly passing 'ServerSideEncryption': 'aws:kms' via transport_params.
What am I doing wrong?
Upvotes: 2
Views: 845
Reputation: 25604
import boto3
from smart_open import open

with open(
    's3://' + BUCKET_NAME + '/robots.txt',
    'w',
    transport_params={
        'client_kwargs': {
            'S3.Client.create_multipart_upload': {
                'ServerSideEncryption': 'aws:kms'
            }
        },
        'client': boto3.client('s3')
    }
) as json_file:
    json_file.write('test')
I found the proper settings of transport_params
to pass SSE: the extra kwargs go under 'client_kwargs', keyed by the boto3 method name ('S3.Client.create_multipart_upload'). There was also no need to pass SSEKMSKeyId;
with it omitted, the default aws/s3 KMS
key is used.
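If you need the same transport_params in several places, they can be built by a small helper. This is just a sketch: the function name is mine, and it only assumes the 'client_kwargs' / 'S3.Client.create_multipart_upload' structure shown in the working example above.

```python
def build_sse_kms_transport_params(kms_key_arn=None):
    """Build smart_open transport_params enforcing SSE-KMS on S3 uploads.

    If kms_key_arn is None, no SSEKMSKeyId is sent and S3 falls back to
    the account's default aws/s3 KMS key.
    """
    upload_kwargs = {'ServerSideEncryption': 'aws:kms'}
    if kms_key_arn:
        # Optional: pin a specific customer-managed key instead of aws/s3.
        upload_kwargs['SSEKMSKeyId'] = kms_key_arn
    return {
        'client_kwargs': {
            # smart_open forwards these kwargs to the named boto3 call.
            'S3.Client.create_multipart_upload': upload_kwargs,
        },
    }
```

You would then call open(..., 'w', transport_params={**build_sse_kms_transport_params(), 'client': boto3.client('s3')}).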
Upvotes: 2