kai

Reputation: 743

Installing a specific PyTorch build (e.g. CPU-only) with Poetry

I've recently started using Poetry to manage dependencies. In one project we use PyTorch. How do I add it with Poetry?

We work both on machines that have no access to a CUDA GPU (for simple on-the-road inference/testing) and on workstations that do have CUDA GPUs. Is it possible to use Poetry to ensure every dev is using the same PyTorch version?

There seems to be no obvious way to choose which PyTorch build to install. I thought about adding the different installation instructions as extra dependencies, but I couldn't find an option equivalent to:

pip3 install torch==1.3.1+cpu torchvision==0.4.2+cpu -f https://download.pytorch.org/whl/torch_stable.html

I would be fine with specifying the full URL of the individual wheels, like: https://download.pytorch.org/whl/torch_stable.html/cpu/torch-1.3.1%2Bcpu-cp36-cp36m-win_amd64.whl

But I would rather not put them in git directly... The closest option I've seen in Poetry is downloading them manually and then using the file = X syntax.

Upvotes: 64

Views: 58057

Answers (7)

tsvikas

Reputation: 17606

Solution using poetry:

Since poetry 1.2, you can do this:

poetry source add --priority explicit pytorch_cpu https://download.pytorch.org/whl/cpu
poetry add --source pytorch_cpu torch torchvision

and it will install from the specified index URL. This also works with a CUDA-specific source, such as https://download.pytorch.org/whl/cu118.
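
For reference, this writes a source and dependency configuration into pyproject.toml roughly like the following (a sketch; the priority key requires a recent Poetry version, and the version constraints shown are illustrative):

[[tool.poetry.source]]
name = "pytorch_cpu"
url = "https://download.pytorch.org/whl/cpu"
priority = "explicit"

[tool.poetry.dependencies]
# pinned to the explicit source, so these never fall back to PyPI
torch = { version = "^2.0", source = "pytorch_cpu" }
torchvision = { version = "^0.15", source = "pytorch_cpu" }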

See the Poetry documentation for more information.

Solution using uv (Update 2024)

You can use the uv tool to install using a specific index:

uv add torch torchvision --index pytorch_cpu=https://download.pytorch.org/whl/cpu

It is recommended to set explicit = true in pyproject.toml for each index that should only be used by packages that explicitly request it, as shown in the sketch below.
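
A minimal sketch of the relevant pyproject.toml sections, assuming the pytorch_cpu index name from the command above (uv add also records torch and torchvision under the project's dependencies):

[[tool.uv.index]]
name = "pytorch_cpu"
url = "https://download.pytorch.org/whl/cpu"
# only packages that explicitly select this index will use it
explicit = true

[tool.uv.sources]
torch = { index = "pytorch_cpu" }
torchvision = { index = "pytorch_cpu" }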

See the uv documentation for more information.

Upvotes: 60

Bruce Dickie

Reputation: 51

I found this worked for me (with a GPU running CUDA 12.1 and Python 3.11 on Windows) in my toml file. You can find other builds of PyTorch (torch, torchaudio, torchvision) for Windows, Linux and Mac at https://download.pytorch.org/whl/cu121, and CPU-only builds at the corresponding cpu index.

[tool.poetry.dependencies]
torch = {url = "https://download.pytorch.org/whl/cu121/torch-2.2.1%2Bcu121-cp311-cp311-win_amd64.whl"}
torchaudio = {url = "https://download.pytorch.org/whl/cu121/torchaudio-2.2.1%2Bcu121-cp311-cp311-win_amd64.whl"}
torchvision = {url = "https://download.pytorch.org/whl/cu121/torchvision-0.17.1%2Bcu121-cp311-cp311-win_amd64.whl"}

Upvotes: 0

DataMinion

Reputation: 437

As of late 2021, using markers and multiple constraints works:

$ poetry --version
Poetry version 1.1.11
# pyproject.toml
[tool.poetry.dependencies]
python = "~3.9"
torch = [
  {url = "https://download.pytorch.org/whl/cpu/torch-1.10.0%2Bcpu-cp39-cp39-linux_x86_64.whl", markers = "sys_platform == 'linux'"},
  {url = "https://download.pytorch.org/whl/cpu/torch-1.10.0%2Bcpu-cp39-cp39-win_amd64.whl", markers = "sys_platform == 'win32'", }
]
numpy = "^1.21.4"

[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
$ poetry install
The currently activated Python version 3.8.12 is not supported by the project (~3.9).
Trying to find and use a compatible version. 
Using python3.9 (3.9.9)
Creating virtualenv machine-learning in /home/redqueen/machine_learning/.venv
Updating dependencies
Resolving dependencies... (36.0s)

Writing lock file

Package operations: 3 installs, 0 updates, 0 removals

  • Installing typing-extensions (4.0.1)
  • Installing numpy (1.21.4)
  • Installing torch (1.10.0+cpu https://download.pytorch.org/whl/cpu/torch-1.10.0%2Bcpu-cp39-cp39-linux_x86_64.whl)

NOTE: numpy has to be listed explicitly; otherwise importing torch triggers a "Failed to initialize NumPy" warning, as shown below.

Without numpy:

$ python
Python 3.9.9 (main, Nov 23 2021, 00:34:08) 
[GCC 9.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
/home/redqueen/machine_learning/.venv/lib/python3.9/site-packages/torch/package/_directory_reader.py:17: UserWarning: Failed to initialize NumPy: No module named 'numpy' (Triggered internally at  ../torch/csrc/utils/tensor_numpy.cpp:68.)
  _dtype_to_storage = {data_type(0).dtype: data_type for data_type in _storages}
>>> quit()

With numpy:

$ python
Python 3.9.9 (main, Nov 23 2021, 00:34:08) 
[GCC 9.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.cuda.is_available()
False
>>> quit()

Reference:

https://python-poetry.org/docs/dependency-specification/#python-restricted-dependencies

Disclaimer

I do not have a Windows (or Mac) machine to test this on.

Upvotes: 12

bennyl

Reputation: 2956

There is a fork that I maintain called relaxed-poetry. It is a very young fork, but it supports what you want with the following configuration:


# pyproject.toml

[tool.poetry.dependencies]
python = "^3.8"
torch = { version = "=1.90+cu111", source = "pytorch" }

[[tool.poetry.source]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cu111/"
secondary = true

Check it out if you like; it can be installed side by side with Poetry.

Upvotes: 5

Kirell

Reputation: 9818

After spending a couple of hours on this issue, I found a "solution" that combines Poetry and pip just for PyTorch. You don't need to specify the wheel URLs directly, so the setup remains cross-platform.

I'm using Poe The Poet, a nice task runner for Poetry that allows running arbitrary commands:

[tool.poetry.dev-dependencies]
poethepoet = "^0.10.0"

[tool.poe.tasks]
force-cuda11 = "python -m pip install torch==1.8.0+cu111 torchvision==0.9.0+cu111 -f https://download.pytorch.org/whl/torch_stable.html"

You can run:

poetry install

and then:

poe force-cuda11  # relies on pip and uses the PyTorch wheels repo

Upvotes: 19

Antiez

Reputation: 957

An updated solution from this issue in the Poetry GitHub repository:

poetry add torch --platform linux --python "^3.7"
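
As a sketch, this records a dependency entry along these lines in pyproject.toml (the version constraint is illustrative, not what the command will actually resolve):

[tool.poetry.dependencies]
# restricted to Linux and Python ^3.7 via Poetry's dependency specification
torch = { version = "^1.10", platform = "linux", python = "^3.7" }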

Upvotes: 13

GilZ

Reputation: 6477

Currently, Poetry doesn't have a -f option (there's an open issue and an open PR), so you can't use the pip instructions. You can install the .whl files directly:

poetry add https://download.pytorch.org/whl/torch_stable.html/cpu/torch-1.3.1%2Bcpu-cp36-cp36m-win_amd64.whl

or add the dependency directly to your .toml file:

[tool.poetry.dependencies]
torch = { url = "https://download.pytorch.org/whl/torch_stable.html/cpu/torch-1.3.1%2Bcpu-cp36-cp36m-win_amd64.whl" }

Upvotes: 30
