Reputation: 149155
This is kind of a followup to another question of mine.
I would like to provide localized versions of a package. Following the Python documentation, I have extracted a .pot file with pygettext, prepared a translation in a .po file, and compiled it into a .mo file.
Everything works fine up to that point, and my package displays the translated messages.
But my final goal would be to make it available on PyPI. So I have done some research and found:
- the setuptools documentation: not even a single word about localization...
- the GNU gettext documentation on the MO file format: it explains that the format depends on the endianness of the platform where the file was generated. My understanding is that only the .po files are portable...
- What is the correct way to include localisation in python packages?: the answer is fully relevant and speaks of the setuptools/Babel integration, but leads to:
- Babel: compile translation files when calling setup.py install: an interesting approach, even if it requires the babel module on the target platform. Not that heavy, but way heavier than my own package... In fact, the distributions contain only .po files and they are compiled with Babel at install time (a sketch of that approach follows this list).
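For reference, this is roughly what the Babel-based route looks like; a minimal sketch only, where "mypackage" is a placeholder name (Babel's compile_catalog command is real, the rest is assumed):

# setup.py - sketch of the Babel-based approach; requires Babel when building/installing
from setuptools import setup
from babel.messages import frontend as babel

setup(
    name="mypackage",               # placeholder project name
    packages=["mypackage"],
    setup_requires=["Babel"],
    # expose Babel's catalog compilation as a distutils/setuptools command
    cmdclass={"compile_catalog": babel.compile_catalog},
)

The catalogs are then compiled with something like python setup.py compile_catalog (typically configured through a [compile_catalog] section in setup.cfg), which is the step the linked answer hooks into the install command.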
Is there a way to build platform-specific wheels containing compiled .mo files?
If not, I will have to require Babel on the target and try to find my way through .mo compilation at install time.
Upvotes: 8
Views: 593
Reputation: 149155
After some work, I could build a specific package based on what is below in this answer. It can be used from other projects to automatically compile po files at build time through the magic of setuptools entry_points. It is now available on GitHub (https://github.com/s-ball/mo_installer) and distributed on PyPI (https://pypi.org/project/mo_installer)
The research I did before asking the question gave me enough hints to reach a possible solution.
I can now say that it is possible to include a platform-specific .mo file in a wheel - unfortunately, in my current solution the wheel gives no indication that it is platform specific. But the same solution allows building a source distribution that builds the .mo file on the target platform.
Now for the details:
the tools needed to compile a mo file on the target:
Most solutions found on Google or SO rely either on Babel or on the GNU gettext msgfmt program. But the CPython sources include a pure Python module, Tools/i18n/msgfmt.py, which is enough here. Unfortunately, this tool is often not installed by default on many Linux/Unix-like systems. My solution simply includes a copy of that module (a mere 7k file) taken from the 3.7.1 version. The code looks very stable (few changes in recent years) and it should work for any Python >= 3.3
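As an illustration of what that module provides, here is a minimal sketch of compiling a single catalog with it (the paths and names are placeholders, and msgfmt.py is assumed to have been copied next to the script):

import os
import msgfmt   # local copy of CPython's Tools/i18n/msgfmt.py

# the target directory must follow the gettext layout: <locale>/<lang>/LC_MESSAGES/<domain>.mo
os.makedirs(os.path.join("locale", "fr", "LC_MESSAGES"), exist_ok=True)
# make(po_file, mo_file) parses the catalog and writes the binary file
msgfmt.make(os.path.join("src", "mypackage_fr.po"),
            os.path.join("locale", "fr", "LC_MESSAGES", "mypackage.mo"))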
the setuptools integration
The magic of setuptools is that the same build subcommand is used internally to build a binary wheel, to install with pip from a source package, or to install directly with python setup.py install from a copy (git clone) of the full source package. So I provide a build subclass in setup.py that generates the .mo files at their full paths before calling the superclass method. I also use a MANIFEST.in file to list the files that should be copied into a source distribution, and a package_data setup argument to list what should go into a binary package or installation folder.
run time usage
Provided the .mo hierarchy is installed under a known package, os.path.dirname(__file__) called from a module of that package gives its parent folder.
Code (assuming the msgfmt.py file is copied under a tools_i18n folder and that the po files are under a src folder):
In setup.py:
import os
import re
import sys

from setuptools import setup
from distutils.command.build import build as _build

...  # elided: NAME and other module-level constants

# make the local copy of CPython's msgfmt.py importable
sys.path.append(os.path.join(os.path.dirname(__file__), "tools_i18n"))
import msgfmt

class Builder(_build):
    def run(self):
        # po files in the src folder are named domain_lang.po
        po = re.compile(r"(.*)_(.*)\.po$")
        for file in os.listdir("src"):
            m = po.match(file)
            if m:
                # create the LANG/LC_MESSAGES subdir of "locale"
                path = os.path.join(self.build_lib, NAME, "locale",
                                    m.group(2), "LC_MESSAGES")
                os.makedirs(path, exist_ok=True)
                # use msgfmt.py to compile the po file
                msgfmt.make(os.path.join("src", file),
                            os.path.join(path, m.group(1) + ".mo"))
        _build.run(self)

setup(
    name=NAME,
    ...
    package_data={"": [..., "locale/*/*/*.mo"]},  # ensure .mo files are copied
    cmdclass={"build": Builder},
)
In MANIFEST.in:
...
include src/*
include tools_i18n/*
To use the translations at run time:
import gettext
import locale
import os

locpath = os.path.dirname(__file__)
lang = locale.getdefaultlocale()[0]  # to get the platform default language, or whatever...
tr = gettext.translation("argparse", os.path.join(locpath, "locale"),
                         [lang], fallback=True)
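The translation object returned by gettext.translation can then be used directly or installed globally; a small, generic usage sketch (the message string is a placeholder):

# use the catalog explicitly...
print(tr.gettext("some message"))
# ...or install _() into builtins for the whole application
tr.install()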
A full project using this method is available at https://github.com/s-ball/i18nparse
Last but not least, after a more in-depth reading of the GNU gettext doc, I can say that gettext can process .mo files whatever their endianness:
MO files of any endianness can be used on any platform. When a MO file has an endianness other than the platform’s one, the 32-bit numbers from the MO file are swapped at runtime. The performance impact is negligible.
Upvotes: 3