leela prasad

Reputation: 31

AWS lambda_handler error for set_contents_from_string to upload in S3

I recently started working on a Python script that encrypts data and uploads it to S3 from an AWS Lambda handler. When run from my local machine, the upload to S3 works fine (note: the bucket grants all permissions to anyone), but when the same script runs inside the Lambda handler (against the same wide-open bucket), I get the error below.

{
  "stackTrace": [
    [
      "/var/task/enc.py",
      62,
      "lambda_handler",
      "up_key = up_bucket.new_key('enc.txt').set_contents_from_string(buf.readline(),replace=True,policy='public-read',encrypt_key=False)"
    ],
    [
      "/var/task/boto/s3/key.py",
      1426,
      "set_contents_from_string",
      "encrypt_key=encrypt_key)"
    ],
    [
      "/var/task/boto/s3/key.py",
      1293,
      "set_contents_from_file",
      "chunked_transfer=chunked_transfer, size=size)"
    ],
    [
      "/var/task/boto/s3/key.py",
      750,
      "send_file",
      "chunked_transfer=chunked_transfer, size=size)"
    ],
    [
      "/var/task/boto/s3/key.py",
      951,
      "_send_file_internal",
      "query_args=query_args"
    ],
    [
      "/var/task/boto/s3/connection.py",
      668,
      "make_request",
      "retry_handler=retry_handler"
    ],
    [
      "/var/task/boto/connection.py",
      1071,
      "make_request",
      "retry_handler=retry_handler)"
    ],
    [
      "/var/task/boto/connection.py",
      940,
      "_mexe",
      "request.body, request.headers)"
    ],
    [
      "/var/task/boto/s3/key.py",
      884,
      "sender",
      "response.status, response.reason, body)"
    ]
  ],
  "errorType": "S3ResponseError",
  "errorMessage": "S3ResponseError: 403 Forbidden\n<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>4B09C24C4D79C147</RequestId><HostId>CzhDhtYDERh9E/e4tVHek35G3CEMh0qFifcnd06fKN/oyLHtj9bWg87zZOajBNQDfqIC2QrldsA=</HostId></Error>"
}

Here is the script I am executing:

import boto
import StringIO
# AESCipher is my own helper class defined elsewhere in enc.py.

def lambda_handler(event, context):

    cipher = AESCipher(key='abcd')
    print "ready to connect S3"
    conn = boto.connect_s3()

    # Download the source file and encrypt it
    print "connected to download"
    bucket = conn.get_bucket('s3download')
    key = bucket.get_key("myinfo.json")
    s3file = key.get_contents_as_string()
    lencp = cipher.encrypt(s3file)
    buf = StringIO.StringIO(lencp)
    print lencp

    # Upload the encrypted contents to the destination bucket
    print "connected to upload"
    up_bucket = conn.get_bucket("s3upload")
    up_key = up_bucket.new_key('enc.txt').set_contents_from_string(
        buf.readline(), replace=True, policy='public-read')
    print "completed upload"
    return

Upvotes: 1

Views: 338

Answers (1)

leela prasad

Reputation: 31

Solved the problem: it was caused by policy='public-read'. After removing that argument, the upload works. Also note that even if the IAM role enables all S3 actions (i.e. PutObject, GetObject), the upload can still fail; you need to create a bucket policy for that particular role, and only then does the upload work smoothly.
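For reference, here is a minimal sketch of such a bucket policy, built as a Python dict. The bucket name s3upload comes from the question; the account ID and role name are placeholders you would replace with your Lambda's actual execution role:

```python
import json

# Placeholder ARN -- substitute your Lambda execution role's real ARN.
ROLE_ARN = "arn:aws:iam::123456789012:role/lambda-s3-upload-role"
BUCKET = "s3upload"

# Bucket policy granting the Lambda execution role object read/write
# access on the upload bucket.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowLambdaRoleObjectAccess",
            "Effect": "Allow",
            "Principal": {"AWS": ROLE_ARN},
            "Action": ["s3:PutObject", "s3:GetObject"],
            "Resource": "arn:aws:s3:::%s/*" % BUCKET,
        }
    ],
}

# Paste this JSON into the bucket's Permissions > Bucket policy editor.
print(json.dumps(bucket_policy, indent=2))
```

Attaching this on the bucket side (in addition to the role's own IAM policy) is what made the PutObject call from Lambda succeed for me.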

Upvotes: 1
