Reputation: 23972
My Amazon Lambda function (in Python) is called when an object 123456 is created in S3's input_bucket, does a transformation on the object, and saves it in output_bucket.
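In simplified form, the function looks roughly like this (transform() stands in for the real logic, which I've left out):

```python
import boto3

s3 = boto3.client("s3")
OUTPUT_BUCKET = "output_bucket"

def transform(data):
    # Placeholder for the actual transformation
    return data

def lambda_handler(event, context):
    # Triggered by an ObjectCreated event on input_bucket
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]  # e.g. "123456"
        obj = s3.get_object(Bucket=bucket, Key=key)
        s3.put_object(Bucket=OUTPUT_BUCKET, Key=key, Body=transform(obj["Body"].read()))
```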
I would like to notify my main application whether the request was successful or unsuccessful. For example, a POST to http://myapp.com/successful/123456 if the processing is successful and to http://myapp.com/unsuccessful/123456 if it's not.
One solution I thought of is to create a second Amazon Lambda function that is triggered by a put event in output_bucket and have it do the successful POST request. This solves only half of the problem, because I still can't trigger the unsuccessful POST request.
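Roughly, that second function would be something like this (a sketch using urllib.request, with my app's URL hard-coded):

```python
import urllib.request

def lambda_handler(event, context):
    # Triggered by an ObjectCreated (put) event on output_bucket
    for record in event["Records"]:
        key = record["s3"]["object"]["key"]  # e.g. "123456"
        url = "http://myapp.com/successful/{}".format(key)
        # Empty body; the object id is carried in the URL
        request = urllib.request.Request(url, data=b"", method="POST")
        urllib.request.urlopen(request)
```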
Maybe AWS has a more elegant solution using a parameter in Lambda or a service that deals with these types of notifications. Any advice or pointer in the right direction would be greatly appreciated.
Upvotes: 0
Views: 1007
Reputation: 19758
A few possible solutions which I see as elegant:
You could arrange to run a Lambda function every time a new object is uploaded to the S3 bucket. This function can then kick off an AWS Step Functions state machine execution by calling StartExecution. The advantage of using Step Functions is that you can coordinate the components of your application as a series of steps in a visual workflow.
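For example, the S3-triggered function could start the execution with boto3 (a sketch; the state machine ARN is a placeholder, and the state machine itself would hold the transform and the success/failure notification steps):

```python
import json
import boto3

# Placeholder ARN; point this at your own state machine
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:process-s3-object"

sfn = boto3.client("stepfunctions")

def lambda_handler(event, context):
    # One execution per object created in the bucket
    for record in event["Records"]:
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({
                "bucket": record["s3"]["bucket"]["name"],
                "key": record["s3"]["object"]["key"],
            }),
        )
```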
Upvotes: 2
Reputation: 66667
I don't think there is an elegant AWS solution unless you re-architect: something like your Lambda sending a message with the STATUS to SQS or some intermediary messaging service, and the intermediary then making the POST to your application.
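As a rough sketch of that re-architected flow, the transforming Lambda could push a status message to SQS with boto3 (the queue URL is a placeholder), and whatever consumes the queue would make the POST:

```python
import json
import boto3

# Placeholder queue URL; the intermediary consuming this queue makes the POST
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/processing-status"

sqs = boto3.client("sqs")

def notify(object_key, status):
    # status would be "successful" or "unsuccessful"
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=json.dumps({"object": object_key, "status": status}),
    )
```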
If you still want to go with your approach, you might need to configure a "dead-letter queue" for error handling in failure cases, as described here (note that the use cases described there are not comprehensive, so make sure they cover your case).
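As an illustration, a dead-letter queue can be attached to the function with boto3 (the function name and queue ARN below are placeholders); failed asynchronous invocations then end up in that queue:

```python
import boto3

lambda_client = boto3.client("lambda")

# Placeholder names/ARNs; failed async invocations are sent to the target queue
lambda_client.update_function_configuration(
    FunctionName="my-transform-function",
    DeadLetterConfig={"TargetArn": "arn:aws:sqs:us-east-1:123456789012:transform-dlq"},
)
```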
Upvotes: 1