igsm

Reputation: 1670

Activate python virtualenv in Dockerfile

I have a Dockerfile in which I try to activate a Python virtualenv so that all dependencies get installed inside that environment. However, everything still gets installed globally. I have tried several approaches and none of them worked, and I am not getting any errors either. Where is the problem?

1. ENV PATH $PATH:env/bin

2. ENV PATH $PATH:env/bin/activate

3. RUN . env/bin/activate

I also followed an example of a Dockerfile config for the python-runtime image on Google Cloud, which is basically the same stuff as above.

Setting these environment variables is the same as running source /env/bin/activate.

ENV VIRTUAL_ENV /env

ENV PATH /env/bin:$PATH

Additionally, what does ENV VIRTUAL_ENV /env mean and how is it used?

Upvotes: 129

Views: 178260

Answers (9)

user2138149

Reputation: 16625

Sometimes you have to use venv within a docker container.

Some docker image authors build their containers in such a way that they will not allow you to pip install without creating a venv first.

(There may be ways around this, but why fight against the system?)

One way to make it work is to do the following:

RUN python3 -m venv venv
RUN ./venv/bin/pip install <list of packages to install>
ENTRYPOINT ["./venv/bin/python3", "main.py"]

In other words, call python3 and pip from within the venv directly.

If you have a requirements.txt:

COPY ./requirements.txt .
RUN python3 -m venv venv
RUN ./venv/bin/pip3 install --no-cache-dir -r requirements.txt
ENTRYPOINT ["./venv/bin/python3", "main.py"]


Upvotes: 4

Necro

Reputation: 571

All Python programs executing within a virtual env need that env activated first. Activation must be done by a parent process before running the child Python, or very early in the child Python process. The parent is often bash, but in a Dockerfile the parent could be your ENTRYPOINT program. To activate, you must:

  1. Unset PYTHONHOME
  2. Prepend the virtual env's path to PATH
  3. Pass at least these environment vars to the child Python process when doing exec

For example, if your parent process or ENTRYPOINT were a golang process, you might do something like this before executing the python sub-process:

    // Our python program uses virtual environments, so activate the virtual
    // environment for python sub-processes before running them, so the
    // env vars can be inherited when they are executed.
    execpath := os.Getenv("PATH")
    os.Setenv("PATH", "/venv/bin:"+execpath)
    os.Unsetenv("PYTHONHOME")

...if the virtual env were at /venv for example.
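
If the parent process is a plain shell script used as the ENTRYPOINT rather than a Go program, the same three steps can be sketched like this; the /venv path, main.py, and the script itself are assumptions for illustration:

    #!/bin/sh
    # Hypothetical entrypoint: perform the activation steps by hand,
    # then exec python so it inherits the adjusted environment.
    unset PYTHONHOME
    PATH="/venv/bin:$PATH"
    export PATH
    exec python main.py "$@"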

Upvotes: 0

gdm

Reputation: 7930

The only solution that worked for me is this:

CMD ["/bin/bash", "-c", "source <your-env>/bin/activate && cd src && python main.py"]

Upvotes: 0

Marcus Lind

Reputation: 11450

You don't need to use virtualenv inside a Docker Container.

virtualenv is used for dependency isolation. You want to prevent any installed dependencies or packages from leaking between applications. Docker achieves the same thing: it isolates your dependencies within your container and prevents leaks between containers and between applications.

Therefore, there is no point in using virtualenv inside a Docker container unless you are running multiple apps in the same container. If that's the case, I'd say you're doing something wrong, and the solution would be to architect your app in a better way and split it up into multiple containers.


EDIT 2022: Given that this answer gets a lot of views, I thought it might make sense to add that now, 4 years later, I've realized there actually are valid uses of virtual environments in Docker images, especially when doing multi-stage builds:

FROM python:3.9-slim as compiler
ENV PYTHONUNBUFFERED 1

WORKDIR /app/

RUN python -m venv /opt/venv
# Enable venv
ENV PATH="/opt/venv/bin:$PATH"

COPY ./requirements.txt /app/requirements.txt
RUN pip install -Ur requirements.txt

FROM python:3.9-slim as runner
WORKDIR /app/
COPY --from=compiler /opt/venv /opt/venv

# Enable venv
ENV PATH="/opt/venv/bin:$PATH"
COPY . /app/
CMD ["python", "app.py", ]

In the Dockerfile example above, we create a virtualenv at /opt/venv and activate it with an ENV statement. We then install all dependencies into /opt/venv and can simply copy this folder into the runner stage of our build. This can help minimize the Docker image size.
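
Assuming the file above is saved as Dockerfile next to your code, building and running the image would look roughly like this (the image name myapp is just an illustrative placeholder):

docker build -t myapp .
docker run --rm myapp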

Upvotes: 141

Ellis Percival

Reputation: 5032

There are perfectly valid reasons for using a virtualenv within a container.

You don't necessarily need to activate the virtualenv to install software or use it. Try invoking the executables directly from the virtualenv's bin directory instead:

FROM python:2.7

RUN virtualenv /ve
RUN /ve/bin/pip install somepackage

CMD ["/ve/bin/python", "yourcode.py"]

You may also just set the PATH environment variable so that all further Python commands use the binaries within the virtualenv, as described in https://pythonspeed.com/articles/activate-virtualenv-dockerfile/:

FROM python:2.7

RUN virtualenv /ve
ENV PATH="/ve/bin:$PATH"
RUN pip install somepackage

CMD ["python", "yourcode.py"]

Upvotes: 97

Sergey Nevmerzhitsky

Reputation: 124

Consider migrating to pipenv, a tool that automates virtualenv and pip interactions for you. It's recommended by the PyPA.

Reproducing an environment via pipenv in a Docker image is very simple:

FROM python:3.7

RUN pip install pipenv

COPY src/Pipfile* ./

RUN pipenv install --deploy

...

Upvotes: -6

monitorius

Reputation: 3956

Setting these variables

ENV VIRTUAL_ENV /env
ENV PATH /env/bin:$PATH

is not exactly the same as just running

RUN . env/bin/activate

because activation inside a single RUN will not affect any lines below that RUN in the Dockerfile. Setting environment variables through ENV, however, will keep your virtual environment activated for all subsequent RUN commands.

Look at this example:

RUN virtualenv /env                      # setup env
RUN which python                         # -> /usr/bin/python
RUN . /env/bin/activate && which python  # -> /env/bin/python
RUN which python                         # -> /usr/bin/python

So if you really need the virtualenv activated for the whole Dockerfile, you need to do something like this:

RUN virtualenv /env
ENV VIRTUAL_ENV /env                     # activating environment
ENV PATH /env/bin:$PATH                  # activating environment
RUN which python                         # -> /env/bin/python

Upvotes: 51

Chirag Maliwal

Reputation: 442

If you are using Python 3.x:

RUN pip install virtualenv
RUN virtualenv -p python3.5 virtual
RUN /bin/bash -c "source /virtual/bin/activate"

If you are using Python 2.x:

RUN pip install virtualenv
RUN virtualenv virtual
RUN /bin/bash -c "source /virtual/bin/activate"

Upvotes: -3

pinty

Reputation: 499

Although I agree with Marcus that this is not the way to do it with Docker, you can still do what you want.

Using Docker's RUN command directly will not give you what you want, as it will not execute your instructions from within the virtual environment. Instead, squeeze the instructions to be executed into a single line using /bin/bash. The following Dockerfile worked for me:

FROM python:2.7

RUN virtualenv virtual
RUN /bin/bash -c "source /virtual/bin/activate && pip install pyserial && deactivate"
...

This should install the pyserial module only in the virtual environment.
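
If you want to check that the package really ended up only in the virtual environment, one possible sanity check (my own addition, assuming the same /virtual path) would be:

# inside the venv the import should succeed
RUN /bin/bash -c "source /virtual/bin/activate && python -c 'import serial'"
# globally it should fail, so the negated check passes
RUN ! python -c "import serial"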

Upvotes: 18
