Mimi Lazarova

Reputation: 51

Detach sentence-transformer model from GPU to CPU

I have trained a SentenceTransformer model on a GPU and saved it. Now I would like to use it on a different machine that does not have a GPU, but I cannot find a way to load it on cpu.

from sentence_transformers import SentenceTransformer

model_name = 'all-MiniLM-L6-v2'
model = SentenceTransformer(model_name, device='cuda')

Upvotes: 3

Views: 12501

Answers (3)

In SentenceTransformer you don't need to pass device='cpu': when no GPU is available, the model is loaded on the CPU by default.

For CPU:

model = SentenceTransformer(model_name)

For GPU:

model = SentenceTransformer(model_name, device='cuda')

or you can load the model simply like:

model = SentenceTransformer(model_name)

and then:

model.to('cuda')

Upvotes: 0

Jonas

Reputation: 51

You may need to install the CPU version of torch first.

pip3 install torch --index-url https://download.pytorch.org/whl/cpu

As shown here: https://pytorch.org/get-started/locally/

Note: With this version there is no need for the device argument either.
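To see why the CPU-only build can still read files saved on a GPU machine, it helps to know that PyTorch's torch.load accepts a map_location argument that remaps CUDA-saved storages onto the CPU at load time; SentenceTransformer's loading relies on this mechanism. A minimal sketch with a plain tensor (the file path here is just an illustration):

```python
import os
import tempfile

import torch

# Save a tensor, then load it while forcing all storages onto the CPU.
# On a GPU machine the saved tensor could live on CUDA; map_location='cpu'
# remaps it at load time so a CPU-only install can read the file.
t = torch.randn(3)
path = os.path.join(tempfile.gettempdir(), 'tensor.pt')
torch.save(t, path)

loaded = torch.load(path, map_location='cpu')
print(loaded.device)  # cpu
```
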

Upvotes: 2

Schopen Hacker

Reputation: 322

Set the device parameter to cpu.

FYI: device accepts any PyTorch device string (e.g. cpu, cuda, cuda:0). It defaults to None, in which case a GPU is used if one is available.

from sentence_transformers import SentenceTransformer
model_name = 'all-MiniLM-L6-v2'
model = SentenceTransformer(model_name, device='cpu')

Upvotes: 7
