Reputation: 37
I am executing my test configuration and this is the error I am facing. I have a trained Text To Speech model of about 327 MB, plus roughly 250 MB of layers required for its inference. Could the size of the model and layers be the reason? Please help me clarify this and suggest a solution. I am importing the trained model from an S3 bucket and then loading it for further processing.
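Roughly, the relevant part of my handler looks like this (a simplified sketch; the bucket and key names below are placeholders, not my real ones):

```python
import boto3

s3 = boto3.client("s3")

# Placeholder names -- the real bucket and keys differ.
BUCKET = "my-tts-artifacts"
MODEL_KEY = "tts/model.bin"      # trained model, ~327 MB
LAYERS_KEY = "tts/layers.bin"    # layers needed for inference, ~250 MB

def lambda_handler(event, context):
    # /tmp is the only writable location inside Lambda, so both files go there.
    s3.download_file(BUCKET, MODEL_KEY, "/tmp/model.bin")
    s3.download_file(BUCKET, LAYERS_KEY, "/tmp/layers.bin")
    # ... load the model from /tmp and run Text To Speech inference ...
    return {"status": "ok"}
```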
Upvotes: 0
Views: 4069
Reputation: 21510
AWS Lambda's local storage in /tmp is only 512 MB. You are apparently exceeding this limit.
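You can verify this from inside the function itself; a minimal check (plain Python standard library, nothing Lambda-specific assumed):

```python
import shutil

def log_tmp_usage():
    # In Lambda, /tmp holds 512 MB by default.
    total, used, free = shutil.disk_usage("/tmp")
    mb = 1024 ** 2
    print(f"/tmp: total={total // mb} MB, used={used // mb} MB, free={free // mb} MB")

# A 327 MB model plus 250 MB of layers is roughly 577 MB, which already exceeds
# the 512 MB default before any other temporary files are written.
```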
There are five solutions I can think of:
It is hard to tell what the best solution for you is, since so much information is missing. But those solutions should give you a good starting point.
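Purely as an illustration of the kind of change that is possible (an assumption on my side, not a recommendation specific to your setup): the ephemeral storage backing /tmp can nowadays be raised above the 512 MB default, for example via boto3. The function name below is a placeholder:

```python
import boto3

lambda_client = boto3.client("lambda")

# Raise the ephemeral storage (/tmp) above the 512 MB default; valid sizes
# are 512-10240 MB. "my-tts-function" is a placeholder name.
lambda_client.update_function_configuration(
    FunctionName="my-tts-function",
    EphemeralStorage={"Size": 1024},
)
```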
Upvotes: 1