Reputation: 207
Logic: I have an AWS Lambda function that performs insert operations into a table in an RDS database. The Lambda function simply loads the data from an 'emp_details.csv' file into a table 'emp_details' in the RDS database. It works fine for 25,000 rows and 54 columns of data (file size 25MB) in the 'emp_details.csv' file.
Problem: Now that the file 'emp_details.csv' has 500,000 rows and 40 columns of data (file size 400MB), the Lambda function times out after 15 minutes (the maximum timeout configuration) and the data is not inserted into the 'emp_details' table, because the insert operation never completes.
How can I handle this problem? I have already increased the memory of the AWS Lambda function to 2GB.
Upvotes: 0
Views: 976
Reputation: 743
AWS Lambda is not designed for long-running background processes. The ideal solution would be to use ECS, which could consume events from an SQS queue and process the file without Lambda's hard time limit.
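One way the SQS fan-out could be organized is to have a small dispatcher describe the big file as row ranges and enqueue one message per range; each ECS task (or worker) then processes only its own slice. This is a minimal sketch, not from the answer: the function name `build_chunk_messages` and the message fields (`bucket`, `key`, `start_row`, `end_row`) are illustrative assumptions, and the actual `boto3` send call is left out.

```python
import json

def build_chunk_messages(total_rows, rows_per_task, bucket, key):
    """Split the file into row ranges; each message becomes one worker task."""
    messages = []
    start = 0
    while start < total_rows:
        end = min(start + rows_per_task, total_rows)
        # Each worker later reads only rows [start_row, end_row) from S3.
        messages.append(json.dumps({
            "bucket": bucket,
            "key": key,
            "start_row": start,
            "end_row": end,
        }))
        start = end
    return messages

# The dispatcher would then send each message with
# sqs.send_message(QueueUrl=..., MessageBody=msg) for msg in messages.
```

For 500,000 rows and, say, 25,000 rows per task, this yields 20 messages, so 20 independent workers can insert in parallel instead of one function racing a 15-minute clock.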
However, there is a less elegant workaround as well if you don't want to move to ECS.
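If you stay on Lambda, a large part of the 15-minute budget is usually wasted on per-row round trips, so streaming the file and inserting in batches can help a lot. Below is a minimal sketch under stated assumptions: a pymysql-style connection object, and `get_connection`, the batch size, and the column count are placeholders, not details from the question.

```python
import csv
import io

CHUNK_SIZE = 5000  # rows per batch insert; tune to your payload size

def iter_chunks(csv_file, chunk_size=CHUNK_SIZE):
    """Yield lists of rows from an open CSV file, chunk_size rows at a time."""
    reader = csv.reader(csv_file)
    next(reader)  # skip the header row
    chunk = []
    for row in reader:
        chunk.append(row)
        if len(chunk) == chunk_size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk

def load_csv(csv_file, conn, table="emp_details", n_cols=40):
    """Insert the file in batches: one executemany and one commit per chunk."""
    placeholders = ", ".join(["%s"] * n_cols)
    sql = f"INSERT INTO {table} VALUES ({placeholders})"
    with conn.cursor() as cur:
        for chunk in iter_chunks(csv_file):
            cur.executemany(sql, chunk)  # one round trip per chunk
            conn.commit()                # commit per chunk, not per row
```

Batching reduces network round trips and transaction overhead, which is often what pushes a 500,000-row load past the timeout; if it is still too slow, the fan-out approach above is the more robust fix.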
Upvotes: 1