panza

Reputation: 1431

Lambda function triggered by S3 events

I have a Lambda function that is triggered every time a file is written to an S3 bucket. My understanding is that every time a single file arrives (a likely scenario, rather than files being sent in batches), the function is invoked and I am charged for that invocation. My question is: can I batch multiple files so that the function is only invoked once per batch of, say, 10 files? Is this good practice? I don't expect processing to take longer than 15 minutes, so Lambda remains a viable choice.

Thank you

Upvotes: 2

Views: 1869

Answers (2)

Showmik Bose

Reputation: 109

  • 1 - One solution is to group your files into a single archive (e.g., a .rar) and upload that to S3. That way, your Lambda is triggered only once for multiple files.

  • 2 - The other solution, as kamprasad says, is to use SQS.

  • 3 - One last solution I can think of is to use a cron job to trigger the Lambda on whatever schedule fits your requirement. Inside the Lambda, process the accumulated files using threads to finish faster. Keep in mind that you have to choose the memory size and timeout carefully in this scenario (see the sketch after this list).

I've personally used the last solution quite frequently.
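
A minimal sketch of that last approach in Python, assuming a scheduled EventBridge rule invokes the handler; the bucket name, prefix, and process_object are placeholders for your own setup:

    import boto3
    from concurrent.futures import ThreadPoolExecutor

    s3 = boto3.client("s3")
    BUCKET = "my-bucket"      # placeholder: your bucket
    PREFIX = "incoming/"      # placeholder: prefix where unprocessed files land

    def process_object(key):
        # Placeholder: fetch the object and run your per-file logic.
        body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()

    def handler(event, context):
        # Invoked on a schedule; pick up everything that has accumulated
        # since the last run and fan the work out across threads.
        resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
        keys = [obj["Key"] for obj in resp.get("Contents", [])]
        with ThreadPoolExecutor(max_workers=10) as pool:
            pool.map(process_object, keys)

In practice you would also move, tag, or delete processed objects so the next scheduled run doesn't pick them up again.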

Upvotes: 1

kamprasad

Reputation: 648

You can use SQS to decouple this scenario: point the S3 event notifications at a queue, make that queue the Lambda's trigger, and set whatever batch size you want on the event source mapping.
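
A minimal sketch of the consuming side in Python, assuming S3 publishes its event notifications to the queue; process_object stands in for your own logic:

    import json

    def process_object(bucket, key):
        # Placeholder: your per-file processing logic.
        ...

    def handler(event, context):
        # Lambda delivers up to the configured batch size of SQS messages
        # per invocation; each message body is the S3 event notification JSON.
        for record in event["Records"]:
            body = json.loads(record["body"])
            for s3_event in body.get("Records", []):  # absent in s3:TestEvent messages
                bucket = s3_event["s3"]["bucket"]["name"]
                key = s3_event["s3"]["object"]["key"]
                process_object(bucket, key)

With a batching window configured on the event source mapping, Lambda waits to accumulate messages up to the batch size, so ten files can cost one invocation instead of ten.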


Upvotes: 4
