Reputation: 353
I am designing a state machine that runs two Lambda functions in parallel, each returning a JSON array. The results of these functions are then passed to two more Lambda functions that take those inputs and add them to a database. I have all the functions ready and working separately, but when I execute the state machine, one of the executions fails with DataLimitExceeded. I checked the documentation and it says the limit for input or output results is 32,768 characters. The odd thing is that the execution that succeeds is the one whose returned JSON object is about 50k characters, while the one that fails is about 46k characters. So if both exceed the limit, why does one fail and the other not?
{
  "StartAt": "Sync",
  "States": {
    "Sync": {
      "Type": "Parallel",
      "Next": "EnviarNotificacion",
      "Branches": [
        {
          "StartAt": "SyncClientes",
          "States": {
            "SyncClientes": {
              "Type": "Task",
              "Resource": "arn...",
              "Next": "AddClientes"
            },
            "AddClientes": {
              "Type": "Task",
              "Resource": "arn...",
              "End": true
            }
          }
        },
        {
          "StartAt": "SyncArticulos",
          "States": {
            "SyncArticulos": {
              "Type": "Task",
              "Resource": "arn...",
              "Next": "AddArticulos"
            },
            "AddArticulos": {
              "Type": "Task",
              "Resource": "arn...",
              "End": true
            }
          }
        }
      ]
    },
    "EnviarNotificacion": {
      "Type": "Pass",
      "End": true
    }
  }
}
Thanks a lot!
Upvotes: 7
Views: 8354
Reputation: 1713
I have built a middleware, middy-store, that handles the upload and download automatically. The middleware checks the size of a Lambda's output and, if it is too large, uploads it to S3 and returns only the S3 URL or ARN. For the next Lambda, it checks whether the input contains an S3 URL or ARN and downloads the payload again. This happens behind the scenes without much configuration:
import middy from '@middy/core';
import { middyStore } from 'middy-store';
import { S3Store } from 'middy-store-s3';

const handler = middy()
  .use(
    middyStore({
      stores: [
        new S3Store({
          bucket: "your-bucket",
        }),
      ],
    }),
  )
  .handler(async (input) => {
    return { /* your handler */ };
  });
Upvotes: 0
Reputation: 353
Well, someone gave me another solution that worked for me, so I'm posting it here in case anyone has a similar problem. What I did was simply save the data I needed in a file on S3, and in the second function I read that file from S3, and it works just fine. One thing I noticed is that when I saved the files to S3, the data from the function that was working fine was 31 KB while the data from the failing one was 35 KB, so maybe the maximum size is 32 KB and not characters as the documentation says. A minimal sketch of the approach is shown below.
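For reference, here is a rough sketch of that pattern, assuming the AWS SDK v3 for JavaScript; the bucket name, object key, and the fetchRecords/insertIntoDatabase helpers are placeholders standing in for the original logic, not details from the answer:

import { S3Client, PutObjectCommand, GetObjectCommand } from "@aws-sdk/client-s3";

// Placeholders for the original data fetch and database insert.
declare function fetchRecords(): Promise<unknown[]>;
declare function insertIntoDatabase(rows: unknown[]): Promise<void>;

const s3 = new S3Client({});
const BUCKET = "your-sync-bucket"; // assumed: a bucket both functions can access

// First function (e.g. SyncClientes): write the large array to S3 and
// return only a small pointer, so the state payload stays tiny.
export const syncHandler = async () => {
  const records = await fetchRecords();
  const key = `sync/clientes-${Date.now()}.json`;
  await s3.send(new PutObjectCommand({
    Bucket: BUCKET,
    Key: key,
    Body: JSON.stringify(records),
    ContentType: "application/json",
  }));
  return { bucket: BUCKET, key }; // well under the Step Functions payload limit
};

// Second function (e.g. AddClientes): resolve the pointer, download the
// file, and insert the records into the database.
export const addHandler = async (input: { bucket: string; key: string }) => {
  const res = await s3.send(new GetObjectCommand({ Bucket: input.bucket, Key: input.key }));
  const records = JSON.parse(await res.Body!.transformToString());
  await insertIntoDatabase(records);
};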
Upvotes: 11