Hello lad

Reputation: 18790

Access Denied using boto3 through aws Lambda

I use a data processing pipeline built from

S3 + SNS + Lambda

Because S3 cannot send notifications outside its storage region, I use SNS to forward the S3 notifications to a Lambda function in another region.
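Note that when an S3 notification is delivered through SNS, the original S3 event typically arrives as a JSON string inside the SNS envelope (`Records[0].Sns.Message`), so a handler subscribed to the topic may need to unwrap it first. A minimal sketch, assuming the standard SNS envelope shape (the helper name is mine):

```python
import json


def extract_s3_record(event):
    """Unwrap an S3 notification that was delivered via an SNS topic.

    SNS wraps the original S3 event as a JSON string in the
    'Message' field of its own record envelope.
    """
    sns_message = event["Records"][0]["Sns"]["Message"]
    s3_event = json.loads(sns_message)
    record = s3_event["Records"][0]["s3"]
    return record["bucket"]["name"], record["object"]["key"]
```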

The Lambda function is coded as follows:

from __future__ import print_function
import boto3


def lambda_handler(event, context):
    input_file_bucket = event["Records"][0]["s3"]["bucket"]["name"]
    input_file_key = event["Records"][0]["s3"]["object"]["key"]

    input_file_name = input_file_bucket + "/" + input_file_key

    s3 = boto3.resource("s3")
    obj = s3.Object(bucket_name=input_file_bucket, key=input_file_key)
    response = obj.get()

    return event  # echo the event back

When I ran Save and Test, I got the following error:

{
  "stackTrace": [
    [
      "/var/task/lambda_function.py",
      20,
      "lambda_handler",
      "response = obj.get()"
    ],
    [
      "/var/runtime/boto3/resources/factory.py",
      394,
      "do_action",
      "response = action(self, *args, **kwargs)"
    ],
    [
      "/var/runtime/boto3/resources/action.py",
      77,
      "__call__",
      "response = getattr(parent.meta.client, operation_name)(**params)"
    ],
    [
      "/var/runtime/botocore/client.py",
      310,
      "_api_call",
      "return self._make_api_call(operation_name, kwargs)"
    ],
    [
      "/var/runtime/botocore/client.py",
      395,
      "_make_api_call",
      "raise ClientError(parsed_response, operation_name)"
    ]
  ],
  "errorType": "ClientError",
  "errorMessage": "An error occurred (AccessDenied) when calling the GetObject operation: Access Denied"
}

I configured the Lambda role with

full S3 access

and set a bucket policy on my target bucket:

everyone can do anything (list, delete, etc.)

It seems that I haven't set the policy correctly.

Upvotes: 27

Views: 90838

Answers (7)

Vivek Puurkayastha

Reputation: 536

The following is a simple IAM policy snippet that could be added to the IAM role. Although it grants full access to the bucket, that is generally not a problem, because dedicated buckets are used for Lambda operations in production.

But if full access is not desired, then whichever operations are throwing the AccessDenied error will need to be granted separately.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowAllOnThisBucket",
            "Effect": "Allow",
            "Action": [
                "s3:*"
            ],
            "Resource": "arn:aws:s3:::bucket_name/*"
        }
    ]
}

Upvotes: 1

omuthu

Reputation: 6333

The specific S3 object you are looking for may have limited permissions. Check:

  1. S3 object level permission for read is denied
  2. The role attached to lambda does not have permission to get/read S3 objects
  3. If access granted using S3 bucket policy, verify read permissions are provided

Upvotes: 10

Tal Joffe

Reputation: 5828

Adding to Amri's answer: if your bucket is private and you have the credentials to access it, you can use boto3.client:

import boto3

s3 = boto3.client('s3', aws_access_key_id='ACCESS_KEY', aws_secret_access_key='SECRET_KEY')
response = s3.get_object(Bucket='BUCKET', Key='KEY')

*For this file: s3://bucket/a/b/c/some.text, Bucket is 'bucket' and Key is 'a/b/c/some.text'

---EDIT---

You can easily change the script to read the keys from environment variables, for instance, so they are not hardcoded. I left it like this for simplicity.
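For instance, a small sketch of the environment-variable approach (the variable names `AWS_KEY_ID` and `AWS_SECRET` here are just placeholders, not standard names):

```python
import os


def credential_kwargs():
    """Read the access keys from the environment instead of hardcoding them.

    The returned dict can be passed straight to boto3.client('s3', **kwargs).
    """
    return {
        "aws_access_key_id": os.environ["AWS_KEY_ID"],
        "aws_secret_access_key": os.environ["AWS_SECRET"],
    }

# s3 = boto3.client('s3', **credential_kwargs())
```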

Upvotes: 7

VIPIN KUMAR

Reputation: 3127

In my case, the Lambda I was running had a role blahblahRole, and that role didn't have permission on the S3 bucket.

Upvotes: 1

Coffee

Reputation: 1771

I had a similar problem; the difference was that the bucket was encrypted with a KMS key.

Fixed with: IAM -> Encryption keys -> YOUR_AWS_KMS_KEY -> add your role or account to the key policy.
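In policy terms, the Lambda role also needs permission to decrypt with that key before GetObject on an encrypted object can succeed. A sketch of the extra IAM statement (the key ARN is a placeholder):

```json
{
    "Effect": "Allow",
    "Action": [
        "kms:Decrypt"
    ],
    "Resource": "arn:aws:kms:REGION:ACCOUNT_ID:key/KEY_ID"
}
```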

Upvotes: 7

Rob Rose

Reputation: 1962

Omuthu's answer correctly identified my problem, but it didn't provide a solution, so I thought I'd add one.

It's possible that when you set up your permissions in IAM, you wrote something like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::test"
            ]
        }
    ]
}

Unfortunately, that's not correct. You need to apply the object-level permissions to the objects in the bucket, not to the bucket itself. So it has to look like this:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::test"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::test/*"
            ]
        }
    ]
}

Note the /* at the end of the second ARN.

Upvotes: 32

Amri

Reputation: 1100

I had a similar problem and solved it by attaching the appropriate policy to my user:

IAM -> Users -> Username -> Permissions -> Attach policy.

Also make sure you have configured the correct access key and secret access key; you can do so using the AWS CLI.
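The AWS CLI stores those keys in ~/.aws/credentials in INI format, which is also where boto3 looks for them by default. A small sketch that checks what the default profile contains (the path and profile name follow the standard layout; the helper name is mine):

```python
import configparser
import os


def read_default_keys(path=os.path.expanduser("~/.aws/credentials")):
    """Parse the AWS CLI credentials file and return the default profile's keys."""
    config = configparser.ConfigParser()
    config.read(path)
    profile = config["default"]
    return profile["aws_access_key_id"], profile["aws_secret_access_key"]
```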

Upvotes: 17
