SkippyElvis

Reputation: 100

How to correctly package Python3 app with subpackages

I have a package, s3_backend, which contains a module api.py and a subpackage util, which itself contains 5 modules. I would like to package it up and upload it to PyPI so people can pip install the package and use the scripting API, api.py, in the top-level package. Both packages (s3_backend and s3_backend.util) include __init__.py files.

The whole project is held in a directory named project with the following structure:

project
    |- s3_backend (package)
        |- __init__.py
        |- util (package)
            |- __init__.py
            |- module1.py
            |...
            |- module5.py
        |- api.py
    |- setup.py

api.py itself imports module1 through module4.

Locally, from within project, I can do import api and then api.func1()... with no errors. When I uploaded the package to TestPyPI and installed it in a virtualenv in a new directory, I ran into problems that trace back to the import statement in api.py that imports modules 1-4 from the subpackage s3_backend.util.

This line, from util import module1, module2, module3, module4, throws the error No module named "util" when I run the command from s3_backend import util.

I can successfully

import s3_backend

and

help(s3_backend)

shows

api.py
util (package)

The contents of my setup.py are shown below. What is the proper way of handling the subpackages in my setup.py, and am I writing my import statements wrong?

For fixes, I have tried replacing

from util import ...

with

from .util import ... 

and

from s3_backend.util import ... 

but those caused problems locally.

Source code for setup.py:

# project/setup.py
from distutils.core import setup

setup(
  name='s3_backend',
  version='0.1.7',
  license='MIT',
  description='scripting api for file upload to s3',
  author='SkippyElvis',
  author_email='Skippy@Elvis.com',
  url='https://github.com/jackhwolf/s3_backend',
  keywords=['aws', 's3', 'file upload'],
  packages=['s3_backend', 's3_backend.util'],
  classifiers=[
    'Programming Language :: Python :: 3',
  ],
)

Import statement in api.py:

from util import module1, module2, module3, module4

Please let me know if there is anything else you need to help me out. Thanks!


Upvotes: 1

Views: 405

Answers (1)

stonecharioteer

Reputation: 1089

Could you try using setuptools instead? The find_packages function works wonderfully, as long as you have an __init__.py file in each folder you want to include.

Here's what you'd need to use in the setup.py

# project/setup.py
from setuptools import setup, find_packages

setup(
  name='s3_backend',
  version='0.1.7',
  license='MIT',
  description='scripting api for file upload to s3',
  author='SkippyElvis',
  author_email='Skippy@Elvis.com',
  url='https://github.com/jackhwolf/s3_backend',
  keywords=['aws', 's3', 'file upload'],
  packages=find_packages(),
  classifiers=[
    'Programming Language :: Python :: 3',
  ],
)
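With both __init__.py files in place, find_packages() should discover the top-level package and the subpackage, equivalent to the explicit packages=['s3_backend', 's3_backend.util'] you had before. A quick way to confirm (a sketch, run from the project/ directory):

# quick sanity check, run from the project/ directory
from setuptools import find_packages

print(find_packages())
# with both __init__.py files present this should print
# ['s3_backend', 's3_backend.util']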

Otherwise, you could MacGyver a function to mimic find_packages, but I wouldn't recommend that.

As for your imports, I am an advocate of absolute, root-level imports. They enforce two things: you consume your library internally the same way your users will, and you can't casually run your code from inside some folder; you have to invoke it through tests or another external entry point. The way they "enforce" that is that an absolute import simply won't resolve when you run a module directly from inside the package directory; it only works when the package is installed or importable from the project root.
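For instance, api.py could spell out the full package path (a sketch reusing the module names and the func1 example from your question):

# s3_backend/api.py -- sketch with root-level absolute imports
from s3_backend.util import module1, module2, module3, module4

def func1():
    # ... existing upload logic ...
    pass

Locally this resolves as long as the project directory is on sys.path, for example when you start Python from project/ or after an editable install (pip install -e .), rather than running api.py directly from inside s3_backend/.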

Upvotes: 1
