Reputation: 263
I am planning to invoke an AWS Lambda function by modifying objects in an AWS S3 bucket. I also need to send a large amount of data to the Lambda function. How can I send the data to it efficiently?
Upvotes: 2
Views: 2433
Reputation: 106
I recently did this by gzipping the data before invoking the Lambda function. This is super easy to do in most programming languages. Depending on your data, this will be a better or worse solution: the content of my database had a lot of repetition and compressed very well.
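A minimal sketch of that approach in Python, assuming the payload is JSON-serializable (the helper names here are my own, not part of any AWS API). The gzipped bytes are base64-encoded because a Lambda invocation payload must be valid JSON text:

```python
import base64
import gzip
import json

def compress_payload(data) -> str:
    """Gzip a JSON-serializable payload and base64-encode it so it can
    travel inside a Lambda invocation payload (which must be JSON text)."""
    raw = json.dumps(data).encode("utf-8")
    return base64.b64encode(gzip.compress(raw)).decode("ascii")

def decompress_payload(blob: str):
    """Reverse of compress_payload; run this inside the Lambda handler."""
    return json.loads(gzip.decompress(base64.b64decode(blob)))
```

Note that the invocation payload is still size-limited (6 MB for synchronous invokes), so compression only helps if your data fits under that limit once compressed.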
Upvotes: 1
Reputation: 33
Your Lambda function should just read from the database your large data resides in.
Assuming your modified object on S3 contains - inside the object or as the object name - some type of foreign key to the data you need out of your database:
A) If your Lambda has direct access to the database: have your Lambda function query the database directly to pull the data.
B) If your Lambda does not have direct access to the database: consider cloning the data as needed from the database into a secure S3 bucket that your Lambdas can read when they are triggered. Clone the data to S3 as JSON (or another easy-to-read format), organized as logical objects for your business case (orders, customers, whatever). If it is possible for your use case, this method will be the fastest and most efficient for the Lambda.
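Option B might look roughly like the handler below. The bucket name, the `.json` key convention, and the assumption that the trigger object's key doubles as the foreign key are all hypothetical, just to illustrate the shape:

```python
import json

# Hypothetical bucket holding the cloned JSON data (name is an assumption).
DATA_BUCKET = "my-cloned-data-bucket"

def key_from_event(event: dict) -> str:
    """Pull the modified object's key out of an S3 trigger event."""
    return event["Records"][0]["s3"]["object"]["key"]

def handler(event, context):
    # boto3 is provided by the Lambda runtime; imported lazily here so the
    # module can be inspected and tested without the AWS SDK installed.
    import boto3
    s3 = boto3.client("s3")

    # Assumed convention: trigger key "orders/1234" maps to the cloned
    # object "orders/1234.json" in the data bucket.
    fk = key_from_event(event)
    obj = s3.get_object(Bucket=DATA_BUCKET, Key=f"{fk}.json")
    data = json.loads(obj["Body"].read())

    # ... process `data` for your business case ...
    return {"processed": fk}
```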
Upvotes: 1
Reputation: 1209
I would first send the data to another S3 bucket and then read it from the Lambda function.
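As a sketch, the producer side could stage the payload like this before triggering the Lambda; the `stage_payload` helper and the bucket/key names are assumptions, and in real use `s3_client` would be `boto3.client("s3")`:

```python
import json

def stage_payload(s3_client, bucket: str, key: str, data: dict) -> str:
    """Upload a large payload to a staging S3 bucket so the Lambda can
    fetch it by key instead of receiving it in the size-limited
    invocation payload. Returns the S3 URI of the staged object."""
    body = json.dumps(data).encode("utf-8")
    s3_client.put_object(Bucket=bucket, Key=key, Body=body)
    return f"s3://{bucket}/{key}"
```

The invocation payload (or the triggering S3 event) then only needs to carry the small bucket/key pointer, and the Lambda reads the data back with `get_object`.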
Upvotes: 4