Reputation: 1508
I am using AWS SageMaker. I have used it before and had no problems reading data from an S3 bucket. So I set up a new notebook instance and did this:
import pandas as pd
from sagemaker import get_execution_role

role = get_execution_role()
bucket = 'my-bucket'
data_key = 'myfile.csv'
data_location = 's3://{}/{}'.format(bucket, data_key)
df = pd.read_csv(data_location)
What I got is this:
PermissionError: Access Denied
Note: I checked the IAM roles and policies, and it seems to me that I have all the necessary rights to access the S3 bucket (AmazonS3FullAccess etc. are granted). What is different from before is that my data is now encrypted. Is there something I have to set up besides the roles?
Edit:
The role I use consists of three policies. These are
and an execution role where I added kms:Encrypt and kms:Decrypt. It looks like this:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "xyz",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:DeleteObject",
        "kms:Encrypt",
        "kms:Decrypt"
      ],
      "Resource": "arn:aws:s3:::*"
    }
  ]
}
Is there something missing?
Upvotes: 1
Views: 3306
Reputation: 57184
You need to add (or modify) an IAM policy statement to grant access to the KMS key the bucket uses for its default encryption:
{
  "Sid": "KMSAccess",
  "Effect": "Allow",
  "Action": [
    "kms:Decrypt"
  ],
  "Resource": "arn:aws:kms:example-region-1:123456789098:key/111aa2bb-333c-4d44-5555-a111bb2c33dd"
}
Alternatively, you can change the key policy of the KMS key itself to grant the SageMaker role access. https://aws.amazon.com/premiumsupport/knowledge-center/s3-bucket-access-default-encryption/
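To fill in that `Resource` ARN you first need to know which KMS key the bucket actually uses. A minimal sketch (assuming boto3 is available and your credentials are configured; the bucket name `my-bucket` is just an example) that looks up the bucket's default-encryption key via `get_bucket_encryption`:

```python
def kms_key_from_encryption_config(config):
    """Extract the KMS key id/ARN from a get_bucket_encryption response dict.

    Returns None if the bucket uses SSE-S3 (AES256) or has no
    aws:kms default-encryption rule.
    """
    rules = config["ServerSideEncryptionConfiguration"]["Rules"]
    for rule in rules:
        default = rule.get("ApplyServerSideEncryptionByDefault", {})
        if default.get("SSEAlgorithm") == "aws:kms":
            return default.get("KMSMasterKeyID")
    return None

if __name__ == "__main__":
    # boto3 imported here so the helper above has no dependencies.
    import boto3

    s3 = boto3.client("s3")
    # Raises ServerSideEncryptionConfigurationNotFoundError if the
    # bucket has no default encryption configured.
    config = s3.get_bucket_encryption(Bucket="my-bucket")
    print(kms_key_from_encryption_config(config))
```

Whatever ARN this prints is what goes into the `Resource` field of the `kms:Decrypt` statement above (note that the objects may instead have been uploaded with a per-object SSE-KMS key, which `HeadObject` would show in its `SSEKMSKeyId` field).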
Upvotes: 2