neuront

Reputation: 9612

Python import modules: what is the difference between a file and a directory?

I have the following files and directories in my project root directory:

main.py
bar.py
foo/
    __init__.py
    alice.py
    bob.py

The files in the directory foo are all empty, and the content of bar.py is:

alice = None
bob = None

and main.py is

import foo
import bar
print 'foo:', dir(foo)
print 'bar:', dir(bar)

When I execute python main.py, the output is:

foo: ['__builtins__', '__doc__', '__file__', '__name__', '__package__', '__path__']
bar: ['__builtins__', '__doc__', '__file__', '__name__', '__package__', 'alice', 'bob']

Why is there no alice or bob in foo? And what should I do, other than

from foo import alice, bob

if I want to use alice and bob from the module foo, since there might be a lot of files in that folder?

EDIT

My question is not about the built-in function dir giving a weird result. If I do this in main.py:

import foo
foo.alice

An exception will occur: AttributeError: 'module' object has no attribute 'alice'

There seems to be no alice in foo. I think I have some problem understanding how to import a directory as a module.

Upvotes: 2

Views: 1124

Answers (4)

avasal

Reputation: 14854

The alice and bob that you are seeing in dir(bar) are the variables defined in bar.py.

dir(foo) doesn't mean directory.

dir : If called without an argument, return the names in the current scope.
Else, return an alphabetized list of names comprising (some of) the attributes
of the given object, and of attributes reachable from it.
If the object supplies a method named __dir__, it will be used; otherwise
the default dir() logic is used and returns:
  for a module object: the module's attributes.
  for a class object: its attributes, and recursively the attributes
    of its bases.
  for any other object: its attributes, its class's attributes, and
    recursively the attributes of its class's base classes.

In main.py write:

import foo.alice
import foo.bob

This should give you foo.alice and foo.bob.
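
For instance, a minimal main.py along those lines (just a sketch, reusing the layout and Python 2 print statements from the question) could be:

import foo.alice
import foo.bob

# importing a submodule binds it as an attribute of the package,
# so dir(foo) now lists 'alice' and 'bob'
print 'foo:', dir(foo)
print 'foo.alice:', foo.alice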

Upvotes: 3

Ben

Reputation: 71440

Python doesn't automatically import the submodules of a package. It wants you to specify the name explicitly somewhere: either by using from foo import alice in main.py, by having foo/__init__.py import alice, or by having foo/__init__.py define the __all__ list (which takes effect for from foo import *).

My answer on another SO question goes into a bit more detail about why this is.
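
As a rough sketch of the two package-side options (assuming the foo layout from the question; either line alone is enough), foo/__init__.py could look like:

# foo/__init__.py
# import the submodules here so that "import foo" also loads them ...
from . import alice, bob
# ... and/or list them in __all__ so "from foo import *" picks them up
__all__ = ['alice', 'bob']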

Upvotes: 0

mvanveen

Reputation: 10028

It basically comes down to the difference between a module and a package. From the docs:

Packages are a way of structuring Python’s module namespace by using “dotted module names”. For example, the module name A.B designates a submodule named B in a package named A. Just like the use of modules saves the authors of different modules from having to worry about each other’s global variable names, the use of dotted module names saves the authors of multi-module packages like NumPy or the Python Imaging Library from having to worry about each other’s module names.

The purpose of the __init__.py file within a directory is to make it a package. This file provides a place to specify the public-facing interface for the package. There are two ways you can get alice and bob into foo:

1. Use __all__

In your __init__.py file, you can explicitly declare which modules you want to expose with __all__. The following will expose alice and bob when you use from foo import *:

__all__ = ['alice', 'bob']

2. import alice and bob directly

Alternatively, importing the modules within the __init__.py file will also expose them.

Note that this imports the modules at package initialization time, so whenever you import anything from foo, they will get imported as well (see the sketch below).
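
For example, if foo/__init__.py contains from . import alice, bob (one guess at what this option looks like), then the plain import from the question works (Python 2 sketch):

import foo

# the __init__.py import made the submodules attributes of foo
print 'foo:', dir(foo)
print foo.alice, foo.bob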

Upvotes: 2

mattbornski

Reputation: 12543

# in foo/__init__.py:
__all__ = ['alice', 'bob']

would allow you to do this:

# in main.py
from foo import *

and import only those modules called out explicitly in __all__.
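
With that __init__.py in place, a quick Python 2 sketch of what the star import gives you:

from foo import *

# alice and bob are now bound as top-level names in main.py
print 'alice:', alice
print 'bob:', bob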

Upvotes: 0
