Reputation: 41
I'm developing several different Python packages with my team. Say that we have ~/src/pkg1, ~/src/pkg2, and ~/src/pkg3. How do we add these to PYTHONPATH without each of us having to manage dot-files?
We could add, say, ~/src/site/sitecustomize.py, which is added once to PYTHONPATH, but is it "guaranteed" that there won't be a global sitecustomize.py?
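For concreteness, the shared sitecustomize.py might look something like this (the directory list is just illustrative):

import os
import sys

# sitecustomize is imported automatically at interpreter startup; extend sys.path here
for pkg in ("pkg1", "pkg2", "pkg3"):
    sys.path.insert(0, os.path.join(os.path.expanduser("~"), "src", pkg))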
virtualenv seems like the wrong solution, because we don't want to have to build/install the packages after each change.
Upvotes: 3
Views: 2101
Reputation: 1223
You have a lot of options...
You could centralize the management of dotfiles in a shared repository, optionally under version control. I use a Dropbox folder named dotfiles, but many people use GitHub or other services like that to manage dotfiles.
If you do that, you can guarantee that everyone on your development team shares the same dotfiles. So you could define a dotfile, say .python_proys, which exports the appropriate PATH and PYTHONPATH and which, by convention, every developer sources in their environment.
Suppose pkg1 is only a script, pkg2 is both a script and a module, and pkg3 is only a module. Then .python_proys would contain, for example:
export PATH=$PATH:~/src/pkg1:~/src/pkg2
export PYTHONPATH=$PYTHONPATH:~/src/pkg2:~/src/pkg3
Every developer then has to source this dotfile somewhere, by convention. Each one can do it however they like: one could source the dotfile manually before using the packages; another could source it from .bashrc, .zshenv, or whatever dotfile applies to them.
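For example, a single line like the following in .bashrc or .zshenv would do (assuming the shared dotfiles are checked out at ~/dotfiles, which is only an illustrative location):

source ~/dotfiles/.python_proys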
The idea is to have one centralized point of coordination and only one dotfile to maintain: the .python_proys dotfile.
You could define directories in your home, like ~/dist (for modules) and ~/bin (for scripts), set symbolic links there to the specific packages in ~/src/, and have every developer use this PATH and PYTHONPATH setting:
export PATH=$PATH:~/bin
export PYTHONPATH=$PYTHONPATH:~/dist
So, using the same example as in the dotfiles approach above, where pkg1 is only a script, pkg2 is both a script and a module, and pkg3 is only a module, you could create the symlinks like this:
cd ~/bin
ln -s ../src/pkg1
ln -s ../src/pkg2
cd ~/dist
ln -s ../src/pkg2
ln -s ../src/pkg3
Those commands could be run automatically by a script. You could write a bootstrap script, or simply copy and paste the commands and save them in a shell script. Either way, maintain it and centralize it the same way I explained before. This way the dotfiles will not change; only the script defining the symlinks will.
Upvotes: 2
Reputation: 174748
First, you don't add a Python module to PYTHONPATH; you just add the directory that contains it.
If you want your whole team to be working on some Python package, you can install the package as editable with the -e option in a virtual environment. This way you can continue development and you don't have to mess with the PYTHONPATH. Keep in mind that the working directory is always included in sys.path, so unless you have an external requirement, you don't need a virtual environment, just the source in your working directory.
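For example (a rough sketch; it assumes each package has a setup.py or pyproject.toml so that pip can install it, and the environment location is arbitrary):

python3 -m venv ~/venvs/team
. ~/venvs/team/bin/activate
pip install -e ~/src/pkg1 -e ~/src/pkg2 -e ~/src/pkg3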
Your workflow would then include creating a .pth file to modify your PYTHONPATH. This would be my preferred option. If you have a standard layout across your projects, you can distribute a customized bootstrap script which will create the environment and then adjust the PYTHONPATH automatically. Share this bootstrap script across the team, or add it as part of the source repository.
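A rough sketch of such a bootstrap script (the script name, environment location, and .pth file name are assumptions for illustration):

#!/bin/sh
# bootstrap.sh: create a virtual environment and make the team's source trees importable
python3 -m venv .venv
. .venv/bin/activate
# write a .pth file into the environment's site-packages so the source dirs land on sys.path
site_dir=$(python -c 'import sysconfig; print(sysconfig.get_path("purelib"))')
printf '%s\n' "$HOME/src/pkg2" "$HOME/src/pkg3" > "$site_dir/team_pkgs.pth"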
Upvotes: 0
Reputation: 123531
I suggest looking into creating a name.pth path configuration file as outlined in the site module's documentation. These files can hold multiple paths that will be added to sys.path, and they can be easily edited since they're simply text files.
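For instance, a file such as team_pkgs.pth (an illustrative name) dropped into a site-packages directory (python -m site --user-site prints the per-user one) could contain:

# lines starting with # are ignored; non-absolute lines are resolved against the directory holding the .pth file
/home/alice/src/pkg2
/home/alice/src/pkg3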
Upvotes: 0
Reputation: 3315
I assume that your other modules are at a predictable path (relative to $0).
We can compute the absolute path of $0:
os.path.realpath(sys.argv[0])
then derive your module path from it and append it:
sys.path.append(something)
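A small self-contained sketch of that idea, assuming the script lives in ~/src/pkg1 and the importable packages are the sibling directories ~/src/pkg2 and ~/src/pkg3 (the layout from the question):

import os
import sys

# absolute directory of the running script ($0)
script_dir = os.path.dirname(os.path.realpath(sys.argv[0]))
# append the sibling package directories; adjust the names to your own layout
for sibling in ("pkg2", "pkg3"):
    sys.path.append(os.path.normpath(os.path.join(script_dir, os.pardir, sibling)))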
Upvotes: -1