user10593135

PyTorch: Is there a way to store model in CPU ram, but run all operations on the GPU for large models?

From what I see, most people seem to initialize an entire model and send the whole thing to the GPU. But I have a neural net model that is too big to fit entirely on my GPU. Is it possible to keep the model stored in CPU RAM, but run all the operations on the GPU?

Upvotes: 0

Views: 566

Answers (1)

J_Heads

Reputation: 489

I do not believe this is possible. However, one easy workaround would be to split your model into sections that will each fit into GPU memory along with your batch input:

  1. Send the first section of the model to the GPU and calculate its outputs.
  2. Release that section from GPU memory, and send the next section of the model to the GPU.
  3. Feed the outputs from the previous section into the new section and save its outputs.

Repeat steps 2 and 3 until you reach your model's final output.
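The loop above can be sketched roughly like this. The model and layer sizes here are hypothetical placeholders, and this only covers inference (with training you would also need to handle gradients, which complicates moving sections off the GPU):

```python
import torch
import torch.nn as nn

# Hypothetical large model, split into sections that each fit on the GPU.
# The full model lives in CPU RAM; only one section is on the GPU at a time.
sections = nn.ModuleList([
    nn.Sequential(nn.Linear(1024, 1024), nn.ReLU()),
    nn.Sequential(nn.Linear(1024, 1024), nn.ReLU()),
    nn.Sequential(nn.Linear(1024, 10)),
])

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

x = torch.randn(8, 1024)  # example batch, created on the CPU

out = x.to(device)
with torch.no_grad():
    for section in sections:
        section.to(device)   # move this section to the GPU
        out = section(out)   # run it on the previous section's output
        section.to("cpu")    # move it back to free GPU memory for the next one

print(out.shape)  # torch.Size([8, 10])
```

Note that moving each section back and forth every batch adds significant PCIe transfer overhead, so this trades speed for memory.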

Upvotes: 1
