Reputation: 133
I was able to successfully follow this tutorial to zip up a scikit-learn package and create a layer. I now have two layers loaded: one for scikit-learn, and the layer for numpy and scipy that AWS already provides.
When I try to run my Lambda now, I get this error:

"errorMessage": "Unable to import module 'lambda_function': No module named 'pandas'"

So I tried the same process of zipping up pandas, but when I try to upload the layer I exceed the 50 MB maximum. Is there any way to load pandas and sklearn together so that I do not go above this limit?
Upvotes: 1
Views: 5636
Reputation: 748
You can use the following repo, which includes a process for creating Lambda layers and ships a configuration for sklearn (together with pandas, numpy, and scipy).
Clone it and run its CLI with details such as the Python version and target architecture.
https://github.com/imperva/aws-lambda-layer
Note: Docker must be installed on the machine where you run the process (Docker Desktop works), since it uses docker build.
Upvotes: 1
Reputation: 238299
The 50 MB limit applies to direct uploads only. From the docs:

50 MB (zipped, for direct upload)

If you want larger layers, upload them to S3 first. The 50 MB limit then no longer applies, and you are instead constrained by:

250 MB (unzipped, including layers)
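Since the hard constraint is the unzipped size, it helps to check a layer archive locally before uploading. A minimal sketch (the 250 MB figure is the documented quota; the function names here are my own, not an AWS API):

```python
import zipfile

# 250 MB unzipped, including all layers (from the Lambda quotas docs)
UNZIPPED_LIMIT = 250 * 1024 * 1024

def unzipped_size(zip_path: str) -> int:
    """Sum of the uncompressed sizes of all files in the archive."""
    with zipfile.ZipFile(zip_path) as zf:
        return sum(info.file_size for info in zf.infolist())

def fits_layer_limit(zip_path: str) -> bool:
    """True if the layer's expanded size stays within the 250 MB cap."""
    return unzipped_size(zip_path) <= UNZIPPED_LIMIT
```

If the check passes, upload the archive to S3 (e.g. `aws s3 cp layer.zip s3://my-bucket/layer.zip`) and publish it with `aws lambda publish-layer-version --layer-name sklearn-pandas --content S3Bucket=my-bucket,S3Key=layer.zip` (bucket and layer names are placeholders).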
Upvotes: 1
Reputation: 6998
You can package your function as a Docker container image instead of a zip. This lets you use much bigger libraries, since container images can be up to 10 GB.
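A minimal sketch of such a container image, assuming your handler lives in `lambda_function.py` with entry point `lambda_handler` and your dependencies (pandas, scikit-learn, ...) are listed in a `requirements.txt`:

```dockerfile
# AWS-provided Python base image for Lambda
FROM public.ecr.aws/lambda/python:3.9

# Install the heavy dependencies into the image
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the function code into the task root
COPY lambda_function.py ${LAMBDA_TASK_ROOT}

# Handler in "module.function" form
CMD ["lambda_function.lambda_handler"]
```

Build and push the image to Amazon ECR, then create the Lambda function from that image instead of a zip package.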
Upvotes: 1