Reputation: 8064
I work on a Python project that in one place calls Julia code, and in another uses OpenCV.
Unfortunately, pyJulia prefers the Python interpreter to be dynamically linked to libpython. (I know I could build a custom Julia system image instead, but I fear the build delays whenever I want to test a development version of my Julia code from Python.)
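For reference, this is roughly how I check whether a given interpreter links libpython dynamically (assuming Linux; on macOS `otool -L` would replace `ldd`):

    ldd "$(which python3)" | grep libpython   # prints a libpythonX.Y.so line only for a shared build
    python3 -c 'import sysconfig; print(sysconfig.get_config_var("Py_ENABLE_SHARED"))'   # 1 = shared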
What has worked so far is using Spack instead of Conda: Python built by Spack has a shared libpython, and Spack's repository includes a recent opencv.
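Concretely, the Spack side of my setup looks roughly like this (the `+python3` variant name is from memory, so it is best to double-check with `spack info opencv`):

    spack install python                   # Spack-built python ships a shared libpython
    spack install opencv+python3 ^python   # OpenCV with Python bindings, built against that python
    spack load python opencv               # make both available in the current shell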
Unfortunately, unlike Conda, Spack is designed around compiling everything from source rather than downloading binaries. The installation time of opencv is well over an hour, which is barely acceptable for a one-off install in a development environment, but is dismayingly long when building a Docker image.
So I have a thought: maybe it is possible to integrate my own Python build with the rest of the Conda ecosystem?
Upvotes: 2
Views: 155
Reputation: 59847
This isn't a full solution, but Spack does support binary packages, as well as GitLab build pipelines to build them in parallel and keep them updated. What it does not have (yet) is a public binary mirror, so that you could install these things very quickly from pre-existing builds. That's in the works.
So, if you like the Spack approach, you can set up your own binary caches and automated builds for your dev environment.
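Very roughly, a private build cache could look like the sketch below. The exact subcommands and flags have changed between Spack releases (newer versions use `spack buildcache push`), so treat this only as an outline and check `spack buildcache --help` and `spack mirror --help`:

    # On the machine that compiles packages:
    spack gpg create "My Cache Key" "me@example.com"    # key used to sign the binaries
    spack mirror add mycache file:///path/to/mirror     # register a local mirror directory
    spack buildcache create -d /path/to/mirror opencv   # publish binaries for an installed spec

    # On the consumer side (e.g. inside a Docker build):
    spack mirror add mycache file:///path/to/mirror
    spack buildcache keys --install --trust             # trust the signing key
    spack install --cache-only opencv                   # install from binaries, no compilation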
I am not sure what the solution would be with Conda. You could make your own conda-forge packages, but I think if you deviate from the standard ones, you may end up reimplementing a lot of packages to support your use case. On the other hand, they may accept patches to make your particular configuration work.
Upvotes: 1