Jwan622

Reputation: 11659

Deploying with serverless framework not working because of psycopg2. Docker installation of psycopg2 not working

So this is my serverless.yml file that is relevant:

plugins:
  - serverless-python-requirements

# registers the plugin with Serverless
# hooking into the Framework on a deploy command. Before your package is zipped, it uses Docker to install the
# packages listed in your requirements.txt file and save them to a .requirements/ directory. It then symlinks the
# contents of .requirements/ into your top-level directory so that Python imports work as expected.
custom:
  pythonRequirements:
    dockerizePip: non-linux
    zip: true
    slim: true

In my requirements.txt file, I have this: psycopg2==2.8.3

When I run sls deploy, I see this:

Error: pg_config executable not found.

    pg_config is required to build psycopg2 from source.  Please add the directory
    containing pg_config to the $PATH or specify the full executable path with the
    option:   python setup.py build_ext --pg-config /path/to/pg_config build

And my pg_config script is in /env/local/bin as: pg_config@ -> ../Cellar/postgresql/11.5_1/bin/pg_config

In short, psycopg2 needs to be built in Docker so that the resulting binary is suitable for AWS Lambda, and I can't get this to work using the serverless-python-requirements plugin. What else can I do?

Upvotes: 2

Views: 2737

Answers (2)

abk

Reputation: 341

I tried @vallard's solution and got an error in AWS Lambda:

ImportModuleError: Unable to import module 'handler.py' libpq.so.5: cannot open shared object file: No such file or directory

Finally got it to work by adding psycopg2-binary to the handler's requirements.txt used by the serverless-python-requirements plugin.
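In other words, swapping the source package for the prebuilt wheel avoids compiling against libpq entirely, since psycopg2-binary bundles its own copy of the library. A minimal requirements.txt sketch (the version pin here is only an example):

```
psycopg2-binary==2.8.3
```

With this in place, neither pg_config at build time nor libpq.so.5 at runtime is required in the Lambda environment.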

Upvotes: 1

vallard

Reputation: 1080

I had the same issue, and many of the posts I saw didn't help me. For example, the article here [1] was genuinely helpful on the need to set up security groups and subnets correctly, but looking at its code [2] I don't see how psycopg2 installed at all, because it would hit the same error.

What I did was create a Dockerfile in the top-level serverless directory and inject the files needed into it. The Dockerfile looked like this:

FROM lambci/lambda:build-python3.7
RUN yum install -y postgresql-devel python-psycopg2 postgresql-libs

Then in the serverless.yaml file I added this:

custom:
  pythonRequirements:
    dockerizePip: non-linux
    dockerFile: ./Dockerfile

From the docs [3], this will take the Dockerfile (which now has pg_config) and add the requirements.txt to it. You may need to clear the cached version if this comes back super fast.
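If you do need to force a rebuild, the plugin exposes clean-up commands (assuming a reasonably recent version of serverless-python-requirements; check your installed version's docs):

```shell
# Remove the generated .requirements directory and symlinks
sls requirements clean

# Clear the plugin's download/static cache so the next
# deploy rebuilds the packages inside Docker from scratch
sls requirements cleanCache
```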

Pro tip: if you get it to compile and then hit 30-second timeouts when you invoke the function, you probably need to change the security groups and/or subnets so the function can reach your Redshift cluster.

Hope that helps!

  1. https://serverless.com/blog/etl-job-processing-with-serverless-lambda-and-redshift/
  2. https://github.com/rupakg/serverless-etl
  3. https://github.com/UnitedIncome/serverless-python-requirements

Upvotes: 9
