Marco Ramos

Reputation: 127

Strange CUDA out of memory behavior in PyTorch

Edit: SOLVED. The problem was the number of DataLoader workers; lowering num_workers fixed it.

I am using a 24GB Titan RTX for an image segmentation U-Net in PyTorch.

It keeps throwing CUDA out of memory at different batch sizes, even though I have more free memory than it says it needs, and lowering the batch size INCREASES the memory it tries to allocate, which doesn't make any sense.

Here is what I tried:

Image size = 448, batch size = 8

Image size = 448, batch size = 6

It says it tried to allocate 3.12GB when I have 19GB free, and it still throws an error??

Image size = 224, batch size = 8

Image size = 224, batch size = 6

Reduced the batch size, but it tried to allocate more???

Image size = 224, batch size = 4

Image size = 224, batch size = 2

Image size = 224, batch size = 1

Even with stupidly low image sizes and batch sizes...
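For anyone checking the numbers themselves, a minimal sketch of how I compared what the driver reports as free against what PyTorch's caching allocator is actually holding (these are standard torch.cuda utilities, nothing from my training script):

import torch

device = torch.device("cuda")

# Free and total device memory as reported by the driver (in bytes)
free, total = torch.cuda.mem_get_info(device)

# Memory held by live tensors vs. memory reserved by PyTorch's caching allocator
allocated = torch.cuda.memory_allocated(device)
reserved = torch.cuda.memory_reserved(device)

print(f"driver free: {free / 1e9:.2f} GB of {total / 1e9:.2f} GB")
print(f"allocated:   {allocated / 1e9:.2f} GB")
print(f"reserved:    {reserved / 1e9:.2f} GB")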

Upvotes: 5

Views: 6972

Answers (1)

Marco Ramos

Reputation: 127

SOLVED: the problem was the number of DataLoader workers; lowering num_workers solved it.
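For reference, a minimal sketch of what the fix looks like; the dummy dataset, batch size, and exact worker count below are placeholders standing in for my real setup:

import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder dataset standing in for the real segmentation dataset
images = torch.randn(64, 3, 224, 224)
masks = torch.randint(0, 2, (64, 1, 224, 224)).float()
train_dataset = TensorDataset(images, masks)

# Each worker prefetches its own batches (prefetch_factor per worker), so more
# workers means more batches held in memory at once; lowering num_workers
# reduces that pressure.
train_loader = DataLoader(
    train_dataset,
    batch_size=6,
    shuffle=True,
    num_workers=2,   # lowered from a higher value
    pin_memory=True,
)

for batch_images, batch_masks in train_loader:
    pass  # forward/backward pass would go here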

Upvotes: 3
