Reputation:
From what I see, most people initialize an entire model and send the whole thing to the GPU. But I have a neural net model that is too big to fit entirely on my GPU. Is it possible to keep the model in RAM but run all the operations on the GPU?
Upvotes: 0
Views: 566
Reputation: 489
I do not believe this is possible. However, one easy workaround would be to split your model into sections that fit into GPU memory along with your batch input. Then, for each batch:

1. Move the current section of the model onto the GPU.
2. Run the batch through that section.
3. Move the section back off the GPU to free memory, keeping the intermediate output as the input to the next section.

Repeat steps 1 through 3 until you reach your model's final output.
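A minimal sketch of this idea, assuming PyTorch (the framework is not stated in the question) and a hypothetical model split into three `nn.Sequential` sections:

```python
import torch
import torch.nn as nn

# Fall back to CPU so the sketch also runs without a GPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Hypothetical model, pre-split into sections small enough
# to fit into GPU memory one at a time.
sections = [
    nn.Sequential(nn.Linear(512, 512), nn.ReLU()),
    nn.Sequential(nn.Linear(512, 512), nn.ReLU()),
    nn.Sequential(nn.Linear(512, 10)),
]

def forward_in_sections(x, sections, device):
    x = x.to(device)
    for section in sections:
        section.to(device)   # 1. move this section onto the GPU
        x = section(x)       # 2. run the batch through it
        section.to("cpu")    # 3. move it back to free GPU memory
    return x.cpu()

batch = torch.randn(32, 512)
out = forward_in_sections(batch, sections, device)
print(out.shape)  # torch.Size([32, 10])
```

Note that the repeated host-to-device transfers add overhead on every batch, so this trades speed for the ability to run a model larger than GPU memory.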
Upvotes: 1