Levin

Reputation: 2015

Why use requirements.txt in a Docker image

There is a similar question from last year, but I don't think the responses are widely applicable, and no answer was accepted.

This is in the context of developing small jobs that will only be run in Docker in-house; I'm not talking about sharing work with anyone outside a small team, or about projects getting heavy reuse.

What advantage do you see in using requirements.txt to install packages instead of pip install commands in the Dockerfile? I see one: your Dockerfiles across projects are more cookie-cutter.

I'm not even considering the setup-based approach envisioned in the question I linked.

What downside is there to naming the packages in the Dockerfile:

 RUN pip install --target=/build django==3.0.1 Jinja2==2.11.1 ...
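
For comparison, the requirements.txt variant would be something like this (a minimal sketch; the COPY destination and --target path just mirror the example above):

 COPY requirements.txt .
 RUN pip install --target=/build -r requirements.txt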

@superstormer asked, "What are the upsides to putting it in the Dockerfile?" It's a fair question. When I read coworkers' Dockerfiles in GitLab, I have to navigate away to see the requirements, since I don't have them locally in an editor. Note to self: so clone the repo and look at it in an editor.

Upvotes: 4

Views: 7437

Answers (2)

Jerry101

Reputation: 13427

First consider going with the flow of the tools:

  • To manually install those packages, inside or outside a Docker container, or to test that they work without building a new Docker image, run pip install -r requirements.txt. You won't have to copy/paste the list of packages.
  • To "freeze" on specific versions of the packages and make builds more repeatable, pip freeze prints the exact versions currently installed; redirect its output to create (or refresh) that requirements.txt file for you (see the round-trip sketch after this list).
  • PyCharm will look for a requirements.txt file, let you know if your currently installed packages don't match that specification, help you fix that, show you if updated packages are available, and help you update.
  • Presumably other modern IDEs do the same, but if you're developing in plain text editors, you can still run a script like this to check the installed packages (this is also handy in a git post-checkout hook):
    # Compare declared requirements against the currently installed packages.
    echo -e "\nRequirements diff (requirements.txt vs current pips):"
    diff -yB --suppress-common-lines --ignore-case \
      <(sed 's/ *#.*//;s/^ *--.*//;/^$/d' requirements.txt | sort --ignore-case) \
      <(pip freeze 2>/dev/null | sort --ignore-case)
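
For instance, the full round trip with the packages from the question (versions illustrative) is just:

    pip install django==3.0.1 Jinja2==2.11.1   # install or upgrade the packages
    pip freeze > requirements.txt              # record the exact installed versions
    pip install -r requirements.txt            # elsewhere: reproduce that environment

Note that pip freeze pins transitive dependencies too, which is part of what makes rebuilds repeatable.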
    

Hopefully this makes it clearer that requirements.txt declares the required packages and, usually, their versions. It's more modular and reusable to keep it as a separate file than to embed it inside a Dockerfile.
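
For reference, the file itself is just a plain-text list, one requirement per line; the pins below are simply taken from the question's example:

    # requirements.txt
    django==3.0.1
    Jinja2==2.11.1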

Upvotes: 4

Adam Smith

Reputation: 54223

It's a question of single responsibility.

The Dockerfile's job is to package an application so it can be built as an image. That is, it should describe every step needed to turn the application into a container image.

requirements.txt's job is to list every dependency of a Python application, regardless of its deployment strategy. Many Python workflows expect a requirements.txt and know how to add new dependencies while updating that requirements.txt file. Many other workflows can at least interoperate with requirements.txt. None of them know how to auto-populate a Dockerfile.
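
One concrete example of such a workflow (pip-tools is my illustration here, not something this answer prescribes):

    # assumes pip-tools is installed:  pip install pip-tools
    echo "django" >> requirements.in   # declare a new top-level dependency
    pip-compile requirements.in        # regenerates requirements.txt with exact pins
    pip-sync requirements.txt          # makes the environment match the file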


In short, the application is not complete if it does not include a requirements.txt. Including that information only in the Dockerfile is like writing documentation that teaches your operations folks how to pull and install every individual dependency while deploying the application, rather than using a dependency manager that packages them into the binary you deliver to ops.

Upvotes: 2
