Ritu Chawla

Reputation: 21

How do we import Pandas, Spacy, Numpy, NLTK in a single AWS lambda function using Python 3.6?

I have the following libraries being used in my AWS Lambda function:

pytz/     
nltk/                 
nltk-3.2.5/          
numpy/
psycopg2/
pandas/  
spacy/

But when I zip these libraries along with my code, upload the archive to AWS S3, and link the S3 zip to the Lambda function, it gives the following error:

Unzipped size must be smaller than 262144000 bytes

This is what happens when I try to deploy the zipped package to Lambda.

The size of my zip is 62 MB, and AWS Lambda supports only 50 MB per Lambda function for direct uploads.

Is there any better way in AWS to achieve this?

Upvotes: 2

Views: 1439

Answers (2)

P. Str

Reputation: 670

You can use "Lambda Layers" as a workaround: https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html

All you have to do is package your libs (such as spacy) in a Lambda layer, then attach the layer to the Lambda function so that the function can use the libs from the layer. There are size limitations, but you can deploy multiple layers, and you can reuse the same layer across multiple Lambdas! In the case of spacy, I would recommend removing unnecessary languages from the /spacy/lang directory.

Layers are a nice way to stay DRY (Don't Repeat Yourself).

Upvotes: 0

Morris wong

Reputation: 163

Seems like the "real" limit is 250 MB, according to this blog post:

Just to quote from the article: what you can do is first upload the zip file to S3:

aws s3 cp ./ s3://limits-test-foobar-bucket/ --recursive --exclude "*" --include "*.zip"

And then update the Lambda function's code using the AWS CLI as well:

aws lambda update-function-code --function-name limits-test --region us-east-1 --s3-bucket limits-test-foobar-bucket --s3-key 100MB.zip

Hope that helps!

Upvotes: 0
