Josh Gredvig

Reputation: 41

Getting self signed certificate error with pip install within Docker, but only for certain packages

I'm just playing with a simple example to get a basic understanding of Docker. Here is my Dockerfile:

FROM python:3.7-alpine

# copy all the files to the container
COPY . /test
WORKDIR /test

# install dependencies
RUN pip install pip_system_certs --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org
RUN pip install -r requirements.txt
# run the command
CMD ["python", "./test_script.py"]

The --trusted-host options are what let us get around corporate network security settings and install packages internally on Windows, and they seem to work in Docker too, but only for some packages. For instance, if my requirements.txt includes flask and requests, everything is fine, but pandas and numpy give me

WARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1076)'))': /simple/numpy/

and then fails. It seems odd that this works for some packages but not others.

Any help appreciated.

Using Docker Desktop on Windows 10.

Upvotes: 2

Views: 5540

Answers (1)

sql_knievel

Reputation: 1401

I know my company's big corporate proxy intercepts (most) TLS connections and re-wraps them in a certificate issued by an internal, self-signed root CA. This caused lots of similar headaches for me. I resolved it by:

  • Figuring out what our root cert was: I visited an internet site in Chrome, clicked the lock in the address bar, and viewed the certification path for the site's certificate. The root CA was our internal one.
  • Opening certificate management in the Windows control panel, finding my company's internal root cert under "Trusted Root Certification Authorities", and exporting it as a "Base-64 encoded X.509" file.
  • Copying that certificate file into my Docker image and adding it as a CA certificate to the OS inside the container. After that, everything I ran in the container just worked.

The catch with the last step is that exactly how you do it differs between Linux distributions. I don't know much about Alpine, but these links should point you in roughly the right direction: https://blog.confirm.ch/adding-a-new-trusted-certificate-authority/
https://github.com/gliderlabs/docker-alpine/issues/260
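For Alpine specifically, what those links describe boils down to something like the sketch below. The certificate filename is a placeholder for whatever you exported from the Windows certificate manager; the /usr/local/share/ca-certificates/ path and update-ca-certificates command are the standard Alpine mechanism, but verify against your base image:

    FROM python:3.7-alpine

    # Copy the exported corporate root CA into the image.
    # "company-root-ca.crt" is a placeholder for the Base-64 X.509 file
    # exported from the Windows certificate manager.
    COPY company-root-ca.crt /usr/local/share/ca-certificates/company-root-ca.crt

    # Install the ca-certificates tooling and rebuild the system trust store
    # so pip (and other TLS clients) trust the corporate root.
    RUN apk add --no-cache ca-certificates \
        && update-ca-certificates

After this, the --trusted-host flags should no longer be necessary, since pip will verify the proxy's certificates against the trusted corporate root.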

Also, bonus catch: if you use Python's requests library in your application, it doesn't use the system CA certs by default. If that's a problem for you, read about setting the REQUESTS_CA_BUNDLE environment variable in the accepted answer here: Python Requests - How to use system ca-certificates (debian/ubuntu)?
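A minimal sketch of that workaround, assuming the bundle lives at the usual Alpine/Debian location after update-ca-certificates (check the path in your own image):

```python
import os

# Point requests at the system CA bundle instead of the one bundled
# with certifi. /etc/ssl/certs/ca-certificates.crt is the usual
# location on Alpine/Debian after update-ca-certificates (assumption).
os.environ["REQUESTS_CA_BUNDLE"] = "/etc/ssl/certs/ca-certificates.crt"

# Set the variable before making any requests; requests reads it at
# request time, e.g.:
#   import requests
#   requests.get("https://pypi.org/")
```

You could also bake the variable into the image with an ENV line in the Dockerfile instead of setting it in Python.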

Upvotes: 5
