Syed Waqas

Reputation: 2676

AWS Lambda in Python: Import parent package/directory in Lambda function handler

I have a directory structure like the following in my serverless application (the simplest possible app, to avoid clutter), which I created using AWS SAM with Python 3.8 as the runtime:

├── common
│   └── a.py
├── hello_world
│   ├── __init__.py
│   ├── app.py
│   └── requirements.txt
└── template.yaml

I would like to import the common/a.py module inside the Lambda handler, hello_world/app.py. I know that I could normally do this in Python by adding the path to PYTHONPATH or sys.path, but it doesn't work when the code runs in Lambda inside a Docker container. When invoked, the Lambda handler runs from the /var/task directory and the original folder structure is not preserved.
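
As a sanity check, the contents of the deployment bundle and the import path can be printed from inside the handler (a throwaway debugging sketch, not part of my actual code):

import os
import sys

def lambda_handler(event, context):
    # Throwaway debugging: show what was actually packaged into /var/task
    # and where Python looks for modules at runtime.
    print("Contents of /var/task:", os.listdir("/var/task"))
    print("sys.path:", sys.path)
    ...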

I tried inserting /var/task/common, /var/common, /var and even /common into sys.path programmatically like this:

import sys
sys.path.insert(0, '/var/task/common')
from common.a import Alpha

but I still get the error:

ModuleNotFoundError: No module named 'common'

I am aware of Lambda Layers, but in my scenario I would like to reference the common code directly from multiple Lambda functions without having to upload it to a layer. I want something like the serverless-python-requirements plugin from the Serverless Framework, but for AWS SAM.

So my question is: how should I add this path to common to PYTHONPATH or sys.path? Or is there an alternative workaround, like serverless-python-requirements, that lets me import a module from a parent folder directly without using Lambda Layers?

Upvotes: 18

Views: 18620

Answers (2)

Syed Waqas

Reputation: 2676

I didn't find what I was looking for, but I ended up with a solution: create a single Lambda function at the root that handles all the different API calls within the function. My Lambda function is integrated with API Gateway, so I can get the HTTP method and path using event["httpMethod"] and event["path"] respectively. I can then put all the packages under the root and import them from each other.

For example, say I have two API paths, /items and /employees, and both need to handle GET and POST methods; the following code suffices:

if event["path"] == '/items':
    if event["httpMethod"] == 'GET':
        ...
    elif event["httpMethod"] == 'POST':
        ...
elif event["path"] == '/employees':
    if event["httpMethod"] == 'GET':
        ...
    elif event["httpMethod"] == 'POST':
        ...
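
Since this is the Lambda proxy integration, each of those branches ultimately returns a response dict with a statusCode and a string body. As a rough sketch, a hypothetical GET /items branch could return something like this (the function name and body content are made up):

import json

def handle_get_items():
    # Hypothetical example of what one of the "..." branches above might return;
    # the Lambda proxy integration expects "statusCode" and a JSON string "body".
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"items": []}),
    }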

So now I can have as many packages as I need under this Lambda function. For example, this is how the repository looks now:

├── application
│   └── *.py
├── persistence
│   └── *.py
├── models
│   └── *.py
├── rootLambdaFunction.py
└── requirements.txt

This way, I can import packages at will anywhere within this structure.
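
As a rough illustration (the module and class names below are hypothetical, not my actual code), rootLambdaFunction.py can now use plain absolute imports for the sibling packages, because they all sit next to it at the root of the deployment package:

# rootLambdaFunction.py -- hypothetical names, purely illustrative
from application.item_service import ItemService         # application/item_service.py
from persistence.item_repository import ItemRepository   # persistence/item_repository.py
from models.item import Item                              # models/item.py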

Upvotes: 3

weez

Reputation: 21

I was having a similar issue with package dependencies, and according to the AWS Knowledge Center you should put all of your items at the root level:

You might need to create a deployment package that includes the function modules located in the root of the .zip file with read and execute permissions for all files.

Source: https://aws.amazon.com/premiumsupport/knowledge-center/build-python-lambda-deployment-package/

Upvotes: 1
