ozil

Reputation: 679

How to limit the number of times AWS Lambda executes on failure?

I am new to Python and AWS. Following a tutorial, I created an S3 bucket that triggers the Lambda function below whenever an object is created. While testing, I found that my code initially had an error, so when I uploaded a file the Lambda executed, threw an error, and then kept executing again and again. I am guessing this is the default behaviour of Lambda: if it fails, it retries. I want to add some error handling and maybe some logging too. Also, if there is an error while processing a file, can I limit it so that the Lambda executes only once and doesn't keep retrying?

import boto3
import os
from requests_aws4auth import AWS4Auth

session = boto3.Session()
credentials = session.get_credentials()

# AWS_REGION is set automatically inside the Lambda runtime; the service
# name depends on what the signed requests target (e.g. 'es' for the
# Elasticsearch service used in many tutorials)
region = os.environ['AWS_REGION']
service = 'es'

aws4auth = AWS4Auth(credentials.access_key, credentials.secret_key,
                    region, service, session_token=credentials.token)

s3 = boto3.resource('s3')

def lambda_handler(event, context):
    # the S3 notification event carries the bucket name and object key
    name = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']

    obj = s3.Object(name, key)

    # get the object
    response = obj.get()

    print(response)
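
For the error-handling and logging part I am asking about, this is a minimal sketch of what I have in mind (assuming the same handler as above; as far as I know, anything written through the standard logging module ends up in CloudWatch Logs):

import logging

import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)

s3 = boto3.resource('s3')

def lambda_handler(event, context):
    record = event['Records'][0]['s3']
    name = record['bucket']['name']
    key = record['object']['key']
    try:
        response = s3.Object(name, key).get()
        logger.info('Fetched s3://%s/%s (%s bytes)',
                     name, key, response['ContentLength'])
    except Exception:
        # logger.exception records the full traceback in CloudWatch Logs;
        # swallowing the error (instead of re-raising) marks the invocation
        # as successful, so Lambda should not retry it
        logger.exception('Failed to process s3://%s/%s', name, key)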

Upvotes: 0

Views: 151

Answers (1)

Marcin

Reputation: 238957

S3 invokes your function asynchronously:

Amazon S3 invokes your function asynchronously with an event that contains details about the object.

Therefore, you can change its retry attempts from the default of 2 down to 0:

[Screenshot: the Lambda console's asynchronous invocation settings, with "Retry attempts" set to 0]
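
If you prefer to set this programmatically rather than through the console, the same setting can be applied with boto3 (a sketch; 'my-function' is a placeholder for your function name):

import boto3

client = boto3.client('lambda')

# set the retry attempts for asynchronous invocations (such as S3 events)
# from the default of 2 down to 0
client.put_function_event_invoke_config(
    FunctionName='my-function',  # placeholder name
    MaximumRetryAttempts=0,
)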

Upvotes: 2
