Reputation: 13
I am doing a small Python project with Scrapy. To upload it to AWS Lambda, I simply created a folder, copied all the required libraries from my PC's site-packages, and deployed it to Lambda, which gave the error UNABLE TO IMPORT ETREE. I googled and found this solution: Unable to import lxml etree on aws lambda. I have never used Docker. All I want to know is: after I have run a Docker image of Amazon Linux on my PC, how do I install all the libraries in there and then get them exported out to my PC so I can upload them?
Upvotes: 1
Views: 340
Reputation: 2358
You can install Docker and run a simple bash script like the ones below. The lambci/lambda:build-python3.6 image mirrors the Lambda execution environment, so packages with compiled extensions (such as lxml) get built against the right libraries.
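Note that both scripts pass --no-deps to pip, so requirements.txt must list every package explicitly, transitive dependencies such as lxml included. One way to generate such a file from a working local setup (assuming a virtualenv containing only your project's packages is active) is:

pip freeze > requirements.txt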
If you want to create a layer (and link it to the lambda):
cd mylayer
# Build inside the Lambda build image so compiled extensions match Amazon Linux
docker run --rm -it -v ${PWD}:/var/task lambci/lambda:build-python3.6 pip install -r requirements.txt --no-deps -t python/lib/python3.6/site-packages/
# A layer zip must use the python/lib/python3.6/site-packages/ layout
zip -r ../my-layer.zip python
rm -rf python
cd -
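Once the zip exists, it can be published and attached from the command line; a sketch, assuming the AWS CLI is configured and my-layer / my-function are placeholder names (the real layer ARN is printed by publish-layer-version):

aws lambda publish-layer-version --layer-name my-layer --zip-file fileb://my-layer.zip --compatible-runtimes python3.6
aws lambda update-function-configuration --function-name my-function --layers arn:aws:lambda:us-east-1:123456789012:layer:my-layer:1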
If you want to create a lambda package:
cd mylambda
# Unlike a layer, a deployment package needs its dependencies at the zip
# root next to the handler, since only /var/task is on sys.path
docker run --rm -it -v ${PWD}:/var/task lambci/lambda:build-python3.6 pip install -r requirements.txt --no-deps -t package/
cd package && zip -r ../../my-lambda-package.zip . && cd ..
zip ../my-lambda-package.zip lambda_function.py
rm -rf package
cd -
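The package can then be uploaded without touching the console; a sketch, with my-function again a placeholder:

aws lambda update-function-code --function-name my-function --zip-file fileb://my-lambda-package.zip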
Upvotes: 1