Reputation: 46
I'm working on uploading documents from DynamoDB to CloudSearch using a Lambda function and a DynamoDB Stream.
One thing I'm running into is that the DynamoDB event source batch size limit is 6 MB, while the CloudSearch document upload limit is 5 MB. This means a single Lambda invocation can receive up to 6 MB of data.
Knowing this, I looked for ways to decrease the DynamoDB event source batch size to 5 MB, with no luck.
One idea I had: since each record returned from the DynamoDB Stream includes its size, I could use that size to send only as many records as fit within 5 MB, and keep an array of the excess records outside the handler in the execution environment. Then, during the next invocation of the Lambda function, the unsent records could be uploaded to CloudSearch. (If that's even how it would work?)
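An alternative to carrying leftovers between invocations would be to split the incoming batch into sub-batches within a single invocation and upload each one to CloudSearch separately. A minimal sketch of that size-based chunking, assuming record sizes are approximated by their JSON-serialized length (the `chunk_records` helper and the 5 MB constant are my own illustration, not an AWS API):

```python
import json

# CloudSearch caps a single document batch upload at 5 MB.
MAX_BATCH_BYTES = 5 * 1024 * 1024

def chunk_records(records, max_bytes=MAX_BATCH_BYTES):
    """Split stream records into sub-batches whose serialized size
    stays under max_bytes, so each batch can be uploaded on its own."""
    batches, current, current_size = [], [], 0
    for record in records:
        # Approximate the record's payload size via its JSON encoding.
        size = len(json.dumps(record).encode("utf-8"))
        if current and current_size + size > max_bytes:
            batches.append(current)
            current, current_size = [], 0
        current.append(record)
        current_size += size
    if current:
        batches.append(current)
    return batches
```

Inside the handler, you would then loop over `chunk_records(event["Records"])` and make one CloudSearch upload call per batch, which sidesteps the 6 MB vs. 5 MB mismatch without any cross-invocation state.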
But that brings its own set of problems.
I also looked at editing the batch size setting, but it only limits the maximum number of records returned per batch; it doesn't account for the batch size in MB.
Does anyone have any ideas here? Should I even worry about it? I'm sure there is code out there that accounts for this.
Upvotes: 1
Views: 79