Reputation: 3974
Currently, I am writing custom geospatial scripts / modules in pure Python. These are distributed to a moderate-sized user base internal to the company I write for. Users are on multiple varieties of Linux, OS X, and Windows, so I need to be able to support installation on an array of systems. My broad list of dependencies is:
- GDAL
- Matplotlib with Basemap
- NumPy
- SciPy
- PIL
- Python 2.7+
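One thing that helps triage installs on such a mixed fleet is a small dependency sanity check that reports which of these are importable on a user's machine. A minimal sketch (the module names below are the standard import names for this stack; adjust if your builds expose different ones):

```python
# check_deps.py -- report which required modules a user's machine is missing.
import importlib

REQUIRED = [
    "osgeo.gdal",            # GDAL Python bindings
    "matplotlib",
    "mpl_toolkits.basemap",  # Basemap toolkit
    "numpy",
    "scipy",
    "PIL.Image",             # PIL / Pillow
]

missing = []
for name in REQUIRED:
    try:
        importlib.import_module(name)
    except ImportError:
        missing.append(name)

if missing:
    print("Missing dependencies: %s" % ", ".join(missing))
else:
    print("All dependencies found.")
```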
How have other users maintained a library of tools for such a diverse array of users when they are the sole point of contact for installation? Currently under consideration:
Create a Lubuntu VM with everything that we need and either run it over the network or run it locally on a user's machine.
Modify FWTools and add the components that we need. FWTools works by setting up a local, self-contained environment when a user calls a script (see the launcher sketch after this list).
Create an implementation like PythonAnywhere where we let the user access a bash shell / Python interpreter via a browser. Data is pulled over the network from users' shared drives and output is written back to those drives.
Pip / easy_install / py2exe / py2app have been tested, but they are either not great across multiple platforms (GDAL is a major issue) or create quite large distributables given the number of dependencies.
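To make option 2 concrete, here is a minimal sketch of an FWTools-style launcher: it points environment variables at a bundled, self-contained runtime before running the requested script. All paths below are hypothetical placeholders for wherever the bundle actually lives.

```python
# launch.py -- FWTools-style launcher sketch: run a script inside a bundled,
# self-contained environment instead of the system Python.
import os
import subprocess
import sys

# Assume the bundle lives alongside this launcher (hypothetical layout).
BUNDLE = os.path.dirname(os.path.abspath(__file__))

env = os.environ.copy()
env["GDAL_DATA"] = os.path.join(BUNDLE, "share", "gdal")   # GDAL support files
env["PYTHONPATH"] = os.path.join(BUNDLE, "pylib")          # bundled modules
env["PATH"] = os.path.join(BUNDLE, "bin") + os.pathsep + env.get("PATH", "")

# Run the user's script with the bundled interpreter and environment.
bundled_python = os.path.join(BUNDLE, "bin", "python")
sys.exit(subprocess.call([bundled_python] + sys.argv[1:], env=env))
```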
What other implementations work for you? Am I missing an obvious distribution technique?
Upvotes: 1
Views: 129
Reputation: 6768
It slurps the entire Python runtime, your script, and all of its dependencies into a single binary.
The downside is that app startup takes some time, as the binary has to unpack everything into memory and start from there.
The gain is obviously big: no need to maintain a fragile Python installation plus dependencies, as the binary is self-contained. No installation, nothing.
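For a concrete example, PyInstaller is one bundler that works this way. A minimal build script, assuming a hypothetical entry-point script process_raster.py (the command-line equivalent is pyinstaller --onefile process_raster.py):

```python
# build.py -- bundle a script and its dependencies into one executable
# using PyInstaller's documented programmatic entry point.
import PyInstaller.__main__

PyInstaller.__main__.run([
    "--onefile",                 # pack everything into a single binary
    "--name", "process_raster",  # name of the resulting executable
    "process_raster.py",         # hypothetical entry-point script
])
```

Note you still have to run the build once per target platform (a Windows binary must be built on Windows, and so on), so a bundler complements rather than removes the multi-OS question above.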
Upvotes: 1