Reputation: 393
I'm building a REST API using the Django Python framework, with many external Python packages. I created a Python virtual environment (python -m venv venv) and, after activating it (venv\Scripts\activate), installed the requests package (python -m pip install requests). Then I pushed my project to my git repo and cloned it onto another machine. When I tried to run my Django project there, it asked me to install the requests package again. How can I permanently install packages into my Python virtual environment, or somewhere else, so that I don't have to reinstall them on every machine? I'm looking for something similar to NodeJS/npm, where all packages are installed locally into the project's node_modules folder and travel with the project, so you don't have to reinstall them on a different machine. Thanks
Upvotes: 0
Views: 124
Reputation: 1365
The environment itself is not shareable in the way you describe. I'd recommend using Docker for this use case: if you create a Docker image that contains the correct dependencies, you can easily work in the same environment on different computers. A Python venv cannot be used this way.
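For illustration, here is a minimal Dockerfile sketch; the base image, port, and project layout (requirements.txt and manage.py in the project root) are assumptions, so adjust them to your project:

    # Minimal sketch: assumes requirements.txt and manage.py sit in the project root
    FROM python:3.11-slim
    WORKDIR /app

    # Install dependencies first so Docker caches this layer between builds
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt

    # Copy the rest of the project
    COPY . .

    # Development server only; use gunicorn or similar in production
    CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]

Building the image once (docker build -t myapi . — the name myapi is arbitrary) gives you the same environment on every machine that can run Docker.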
Nevertheless, if your requirements.txt file pins package versions, then the venvs you create on the two machines should be relatively similar (depending, of course, on other factors like the OS, Python version, etc.).
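For completeness, the usual workflow (assuming Windows, since you used venv\Scripts\activate) looks like this:

    # On the original machine: record the installed packages
    python -m pip freeze > requirements.txt

    # Commit requirements.txt with the project, then on the new machine:
    python -m venv venv
    venv\Scripts\activate
    python -m pip install -r requirements.txt

The venv folder itself is normally added to .gitignore; only requirements.txt travels with the repo.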
Upvotes: 2