Reputation: 367
I am building a Python package with a structure like:

```
mypackage/
    mypackage/
        __init__.py
        etc.py
    setup.py
    setup.cfg
    pyproject.toml
```

To build it, I am running `python -m build`. I noticed that version numbers weren't available (e.g. `mypackage.__version__` is undefined after installing), and currently I am just setting it manually like:
`setup.py`:

```python
setup(..., version='0.0.1')
```

`pyproject.toml`:

```toml
[project]
version = '0.0.1'
```
I am new to Python package development, and while there are a few posts on this, there does not seem to be a standard way of doing it.

The package is quite small, and ideally I'd like to update just one thing, like `__version__ = '0.0.1'` inside `__init__.py`, and then have this picked up automatically by `setup.py` and `pyproject.toml`.
Upvotes: 11
Views: 10103
Reputation: 2505
Here is an example of how to add a version attribute to a Python package using hatchling.

Directory tree:

```
├── my_package
│   ├── foo.py
│   └── __init__.py
└── pyproject.toml
```

`pyproject.toml`:

```toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.build.targets.sdist]
packages = ["my_package"]

[tool.hatch.build.targets.wheel]
packages = ["my_package"]

[project]
requires-python = ">= 3.10"
name = "my_package"
# this is the important part
dynamic = ["version"]

[tool.hatch.version]
# from https://hatch.pypa.io/1.13/version/
path = "my_package/__init__.py"
```

`__init__.py`:

```python
__version__ = '0.0.1'
```

Result after building and/or installing:

```python
import my_package

my_package.__version__  # returns '0.0.1'
```
Upvotes: 0
Reputation: 362746
Do you need a `__version__` attribute at all?

Keeping a `__version__` attribute available in the module namespace is a popular convention, but it's possibly falling out of fashion these days because the stdlib `importlib.metadata` is no longer provisional. The one obvious place for a version string is in the package metadata; duplicating that same information in a module attribute may be considered unnecessary and redundant.

It also presents some conundrums for users when the version string exists in two different places: where should we look for it first, the package metadata or the module's top-level namespace? And which one should we trust if the version information found in each of these places differs?
So there is some benefit to storing it in only one place, and that place must be the package's metadata: the `Version` field is a required field in the Core metadata specifications, whereas packages that opt in to providing a `__version__` attribute are merely following a convention.
If you're using a modern build system, then you would specify the version string directly in `pyproject.toml`, as described in PEP 621 – Storing project metadata in pyproject.toml. The way already shown in the question is correct:

```toml
[project]
name = "mypkg"
version = "0.0.1"
```

Users of `mypkg` could retrieve the version like so:

```python
from importlib.metadata import version

version("mypkg")
```
Note that unlike accessing a `__version__` attribute, this version is retrieved from the package metadata only; the actual package doesn't even need to be imported. That's useful in some cases, e.g. for packages which have import side effects (such as numpy), or for retrieving the versions of packages that have unsatisfied dependencies or complicated environment setup requirements.
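A related detail worth knowing: when the distribution is not installed at all (e.g. you are running code straight from a source checkout), `importlib.metadata.version` raises `PackageNotFoundError`. A minimal sketch of a defensive lookup, assuming a hypothetical distribution name `mypkg` that is not installed in the current environment:

```python
from importlib.metadata import PackageNotFoundError, version

try:
    pkg_version = version("mypkg")
except PackageNotFoundError:
    # distribution metadata not found, e.g. running from a source checkout
    pkg_version = "0+unknown"
```

The `"0+unknown"` fallback string here is just an arbitrary placeholder choice.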
Sometimes this is useful, for example to add a `--version` option to your command-line interface. But this doesn't imply that you need a `__version__` attribute hanging around; you can just retrieve the version from your own package metadata the same way:

```python
import argparse
import importlib.metadata

parser = argparse.ArgumentParser(...)
...
parser.add_argument(
    "--version",
    action="version",
    version=importlib.metadata.version("mypkg"),
)
```
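As a self-contained illustration of the `action="version"` behavior (with a hard-coded string standing in for the metadata lookup, since `mypkg` is hypothetical):

```python
import argparse

parser = argparse.ArgumentParser(prog="mypkg")
# hard-coded for the sake of the example; a real CLI would pass
# importlib.metadata.version("mypkg") instead
parser.add_argument("--version", action="version", version="0.0.1")
```

Running `mypkg --version` then prints `0.0.1` and exits with status 0, without ever importing the package itself.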
Upvotes: 17
Reputation: 16590
I have no idea why other answers say that your approach is not recommended; on the contrary, it is the officially recommended one (as of 2023), and core Python developers use it. What you need to make sure of is that the `__version__` string is defined statically, not dynamically (i.e., do not fetch it at runtime from a text file; it must be hardcoded as a Python string in a Python file, so that it is statically available at build time).

Since setuptools v61, you can indeed set a `__version__` attribute in your package's `__init__.py` file as you did, and then dynamically fetch it into your `pyproject.toml` like so:

```toml
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "mypackage"
dynamic = ["version"]

[tool.setuptools.dynamic]
version = {attr = "mypackage.__version__"}
```
Note however that if you also want to access `mypackage.__version__` from inside `mypackage`, you should NOT `from . import __version__`, because if you also import things in your `__init__.py`, this will cause an infinite import loop! Instead, you need to implement a function that reads (not imports!) `__init__.py` and extracts the version string. Fortunately, the official documentation nowadays provides an easy example:

```python
import codecs
import os.path

def read(rel_path):
    here = os.path.abspath(os.path.dirname(__file__))
    with codecs.open(os.path.join(here, rel_path), 'r') as fp:
        return fp.read()

def get_version(rel_path):
    for line in read(rel_path).splitlines():
        if line.startswith('__version__'):
            delim = '"' if '"' in line else "'"
            return line.split(delim)[1]
    else:
        raise RuntimeError("Unable to find version string.")

version = get_version("mypackage/__init__.py")
```
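A quick way to sanity-check this extraction logic without touching a real package is to run it on an in-memory string (a sketch; `extract_version` is a made-up name that re-implements the core of `get_version` above on text instead of a file path):

```python
def extract_version(text):
    # same logic as get_version above, operating on a string
    for line in text.splitlines():
        if line.startswith('__version__'):
            delim = '"' if '"' in line else "'"
            return line.split(delim)[1]
    raise RuntimeError("Unable to find version string.")

source = '"""A made-up package."""\n__version__ = "2.5.0"\n'
extract_version(source)  # returns '2.5.0'
```

Note that a file with no `__version__` assignment raises the `RuntimeError`, which is exactly the failure mode you want at build time.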
If you only want to support Python 3, then you are all set; you can stop reading here.

But if you need to use an older version of setuptools, e.g. if you need to support Python 2.7 (which is strongly discouraged, but some legacy projects may need it), then you can instead use `setup.cfg`, which has supported a dynamic version since an earlier version of setuptools:

```ini
[metadata]
name = mypackage
version = attr: mypackage.__version__
```

Note that you still need an almost empty `setup.py` for the `setup.cfg` to work, otherwise setuptools will choke (it's an official rule):

```python
from setuptools import setup

setup()
```
But then, if you also want your module to have both a `setup.cfg` and a `setup.py` to support Python 2, and also a `pyproject.toml` to support Python 3, you will notice that building may fail under Python 2 because it requires a setuptools version that is too high (the latest available on Python 2 being v44.1.1). Indeed, even though on Python 2 the build is done through `setup.cfg` and `setup.py`, if you have updated pip, it will STILL read `pyproject.toml` and its `[build-system]` table, because that is the only place where a module can specify build-time requirements, according to PEP 517.

To fix this issue, you need to edit the build requirements in `pyproject.toml` to specify the Python version (see PEP 508 about dependency specifications):

```toml
[build-system]
requires = ["setuptools>=44;python_version<'3'", "setuptools>=61;python_version>='3'"]
build-backend = "setuptools.build_meta"
```

Then your package should build fine (with the `build` module or `pip install --use-pep517`) under both Python 3 and Python 2, using only the `pyproject.toml` for the former, or `setup.cfg` for the latter (with a bit of `pyproject.toml` just for the `[build-system]` table).
Upvotes: 3
Reputation: 198324
Looking at various popular libraries should give you some ideas.

One simple way is to parse your `__init__.py` inside `setup.py`, like this example from the wonderful diff-match-patch library:

```python
with open("diff_match_patch/__init__.py") as f:
    for line in f:
        if line.startswith("__version__"):
            version = line.split('"')[1]
```
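The diff-match-patch snippet assumes double quotes; the line-splitting idea generalizes to either quote style with a small helper (a sketch; `parse_version_line` is a made-up name, not part of diff-match-patch):

```python
def parse_version_line(line):
    # pick whichever quote character the assignment actually uses
    delim = '"' if '"' in line else "'"
    return line.split(delim)[1]

parse_version_line('__version__ = "1.2.3"')  # returns '1.2.3'
parse_version_line("__version__ = '4.5.6'")  # returns '4.5.6'
```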
As far as I have seen, most popular Python projects do not put the version in `pyproject.toml`, if they even have one.

Alternatively, you could use the nifty `versioneer` library, which picks up the version from git. For example, at the time I write this answer, the latest tag in the `numpy` repository's `main` branch history is `v1.22.3`, and it is cleanly reflected in `numpy.__version__` being `1.22.3`, with practically no work on the numpy developers' part.
Upvotes: 3