hfhc2

Reputation: 4421

Package a pre-built python extension

I am working on a C library (using cmake as a build system) and a corresponding python extension written in cython.

The build process is driven by cmake, which calls the cython executable to generate a C file. That file is then compiled into python_library.so, which links against the native library.so and other dependencies.

The library works as expected: I can set PYTHONPATH to the build directory, run python, and import and execute the wrapped Python code.
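
For reference, a minimal sanity check along those lines might look like this (the build path, the module name python_library, and some_function are placeholders for whatever cmake actually produces):

import sys
sys.path.insert(0, "/path/to/cmake/build")   # same effect as setting PYTHONPATH to the build directory
import python_library                        # the cython-generated extension module
python_library.some_function()               # placeholder for whatever the wrapper exposes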

What remains is the question of how to install / package the Python module. As far as I know, the recommended way to create Python packages is to use setuptools / distutils in a setup.py file.

It is of course possible to define a C extension (optionally using cython) inside the setup.py file. However, I want the compilation to be handled by cmake (it involves some dependent libraries, etc.).

So basically, I would like to tell python that the whole package is defined by an existing python_library.so file. Is that at all possible?

Note: there is a related question, but its OP has already figured out how to package the extension.

Upvotes: 1

Views: 1736

Answers (1)

ead

Reputation: 34367

Obviously, this is not the most robust way to distribute Python packages, as it will not work across different OSes and may lead to strange results if there is a Python-version mismatch - but it is nevertheless possible.

Let's consider the following folder structure:

/
|--- setup.py
|--- my_package
        |-------  __init__.py
        |-------  impl.pyx          [needed only for creation of impl.so]
        |-------  impl-XXX.so       [created via "cythonize -i impl.pyx"]

With the following content:

__init__.py:

from .impl import foo

impl.pyx:

def foo():
    print("I'm foo from impl")

setup.py:

from setuptools import setup, find_packages


kwargs = {
    'name': 'my_package',
    'version': '0.1.0',
    'packages': find_packages(),

    # ensure the so-files are copied into the installation:
    'package_data': {'my_package': ['*.so']},
    'include_package_data': True,
    'zip_safe': False,
}


setup(**kwargs)

Now after calling python setup.py install, the package is installed and can be used:

>>> import my_package
>>> my_package.foo()
I'm foo from impl

NB: Don't run this test from the folder containing setup.py, because then the local rather than the installed version of my_package may be picked up.
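
A quick way to check which copy is actually being imported (nothing specific to this setup, just the standard module attribute):

>>> import my_package
>>> print(my_package.__file__)   # should point into site-packages, not into the source tree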


You might want to ship different so-binaries for different Python versions. It is possible to have the same extension compiled for several Python versions side by side - you just have to give each resulting shared library the right suffix, for example:

  • impl.cpython-36m-x86_64-linux-gnu.so for Python 3.6 on my Linux machine
  • impl.cpython-37m-x86_64-linux-gnu.so for Python 3.7
  • impl.cp36-win_amd64.pyd for Python 3.6 on Windows
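
If you were to ship such Windows binaries from the same source tree, the package_data glob in the setup.py above would have to cover them as well - a sketch of the adjusted line, assuming the rest of the file stays unchanged:

'package_data': {'my_package': ['*.so', '*.pyd']},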

One can get the valid extension suffixes for the current machine using:

>>> import importlib
>>> importlib.machinery.EXTENSION_SUFFIXES
['.cp36-win_amd64.pyd', '.pyd']
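
If a build script (for example the cmake setup from the question) needs that suffix to name the library it produces, sysconfig exposes the same information; a small sketch:

import sysconfig
print("impl" + sysconfig.get_config_var("EXT_SUFFIX"))   # e.g. impl.cpython-36m-x86_64-linux-gnu.so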

Upvotes: 1
