Reputation: 5934
I'm having issues accessing my custom python modules that are in a lambda layer from within a python subprocess. My use case is thus:
I have a large custom module with many dependencies defined in a lambda layer (which I will call myCustomModule.py). I allow arbitrary code to be uploaded as a Lambda function in a file called function.py. That script gets called in a subprocess in app.py, which contains the handler that Lambda calls.
myCustomModule.py
def printFoo():
print("foo")
function.py
import myCustomModule
myCustomModule.printFoo()
app.py
import subprocess
def handler(event, context):
    functionFilePath = "function.py"
    return subprocess.check_output("python3 " + functionFilePath, shell=True)
When I execute the Lambda function, it returns a ModuleNotFoundError saying function.py cannot find myCustomModule. If I instead import myCustomModule in app.py and call printFoo() from there, it works fine. However, that breaks my use case, because I need to call myCustomModule from function.py, not from app.py.
How can I get the python subprocess to recognize modules in my lambda layer?
Upvotes: 1
Views: 1036
Reputation: 5934
I figured out what the issue was. I needed to do two things:
First, I needed to add the location of my module from my lambda layer to the PYTHONPATH environment variable.
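A minimal sketch of what app.py could look like, assuming the layer's python/ directory is unpacked to /opt/python (the standard location for a Python layer) and that myCustomModule.py lives there:

import os
import subprocess

def handler(event, context):
    # Copy the current environment and prepend the layer path, so the
    # python3 subprocess can import myCustomModule from the layer.
    env = os.environ.copy()
    existing = env.get("PYTHONPATH", "")
    env["PYTHONPATH"] = "/opt/python" + (":" + existing if existing else "")
    return subprocess.check_output(["python3", "function.py"], env=env)

Alternatively, setting PYTHONPATH=/opt/python as an environment variable on the function itself should have the same effect, because the subprocess inherits the parent's environment when env isn't overridden.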
Next, I needed to increase the function's memory and timeout, since my module was pretty large. You can do this in the Lambda console, with the AWS CLI, or with an AWS SDK (I'm using the Java SDK).
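For example, a rough equivalent with boto3 (the Python SDK, not the Java SDK I used) would look like this; the function name and the exact values are placeholders:

import boto3

lambda_client = boto3.client("lambda")

# Raise the memory (MB) and timeout (seconds) so the large module can load.
lambda_client.update_function_configuration(
    FunctionName="my-function",  # placeholder
    MemorySize=1024,
    Timeout=60,
)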
Upvotes: 1