rpicatoste

Reputation: 529

Writing the requirements/setup file for a Python package

Are there any best practices for selecting the versions of the required packages for your own Python package?

You can always do pip freeze > requirements.txt, but this will pin every used package to a specific version.

If this package is installed alongside another one that pins the same requirement to a different specific version, you will have a problem when conda, pip-tools, or Poetry tries to find a valid combination of dependencies.

To let your package work alongside others, it would be best to keep the version specifiers as wide as possible, for example numpy>=1.20,<1.21.

Is there a good way to do this for the entire set of requirements of a package?

Upvotes: 0

Views: 79

Answers (1)

sinoroc

Reputation: 22295

In the packaging metadata of your library (in other words: in setup.py, setup.cfg, or pyproject.toml), only the direct dependencies should be listed. The direct dependencies are the ones that are directly imported by the library's code (and the ones that are called in sub-processes, but that is a rather rare case). Since no one can predict the future, my advice regarding version constraints on the dependencies is to exclude only the versions (or version ranges) that you are 100% sure are incompatible with your library.

For example, say we are working on MyLib v9 and want to add a dependency on SomeLib, which has only v1 and v2 available. The features of SomeLib that we want are not available in v1, but they are in v2. We do not know whether those features will still be there in SomeLib v3+, since we cannot predict the future. So in the packaging metadata of our MyLib v9, we should declare the dependency SomeLib>=2 (or SomeLib!=1).
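As a sketch, this declaration could look like the following in pyproject.toml (PEP 621 metadata); MyLib and SomeLib are the hypothetical names from the example:

```toml
[project]
name = "MyLib"
version = "9.0"
dependencies = [
    # Exclude only v1, which is known to be incompatible;
    # leave the upper end open for future SomeLib releases.
    "SomeLib>=2",
]
```

The same specifier would go into install_requires if you use setup.py or setup.cfg instead.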

Some months later SomeLib v3 and then SomeLib v4 are released, and they are indeed compatible with our library. We do not need to do anything, whoever installs our MyLib will automatically get the latest version of SomeLib, which is v4 and is compatible.

A few more months later, SomeLib v5 is released and it is not compatible with our library. So we should release MyLib v9.post1, where we adjust the dependency specification in the packaging metadata to SomeLib>=2,<5 (or SomeLib!=1,!=5).
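The metadata of that hypothetical post-release would then look something like this:

```toml
[project]
name = "MyLib"
version = "9.0.post1"
dependencies = [
    # v5 is now a proven incompatibility, so it is excluded as well.
    "SomeLib>=2,<5",
]
```

Note that only the metadata changes; the library's code is identical to v9, which is exactly what PEP 440 post-releases are for.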

If, by the time the incompatible SomeLib v5 is released, we have abandoned maintenance of our MyLib and do not publish the post-release, then users of our MyLib are still able to manually exclude the problematic SomeLib v5 from dependency resolution (for example with a pip constraints.txt file). This is well supported in the Python packaging ecosystem and very easy to do.
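For example, a user could write a constraints file like this (again assuming the hypothetical SomeLib):

```
# constraints.txt -- passed to pip with the -c option.
# pip will never select SomeLib v5, no matter what requires it.
SomeLib!=5
```

and then install with pip install MyLib -c constraints.txt. The constraint is applied on top of MyLib's own SomeLib>=2 specification, so any of v2 through v4 can still be picked.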

On the other hand, if right from the start we had eagerly excluded everything but SomeLib v2 in our initial MyLib v9 release (with a dependency specification such as SomeLib==2, say), and then immediately abandoned maintenance of the library, then no one would have been able to install our MyLib v9 with SomeLib v3 (or v4), even though those are perfectly valid combinations. This eager over-specification of exclusions is the cause of many unsolvable dependency conflicts and is a near dead-end; it should really be avoided.

That is why, from my point of view, the only thing dependency version constraints should do is exclude well-known (and proven) incompatibilities. Then, as much as possible, publish post-releases to tighten the version constraints as new incompatibilities appear.
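To see how a resolver evaluates this kind of "exclude only known incompatibilities" specifier, here is a small sketch using the packaging library (the reference implementation of PEP 440 version specifiers); the SomeLib versions are the hypothetical ones from the example above:

```python
from packaging.specifiers import SpecifierSet

# MyLib v9.post1 metadata from the example: SomeLib>=2,<5
spec = SpecifierSet(">=2,<5")

print("3.1" in spec)  # True: v3 is allowed
print("4.0" in spec)  # True: v4 is allowed
print("5.0" in spec)  # False: the proven incompatibility is excluded
print("1.0" in spec)  # False: v1 lacks the features MyLib needs
```

Because the specifier only excludes proven incompatibilities, every future release that does not match an exclusion remains installable without any change to MyLib.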


Upvotes: 1
