Ken

Reputation: 17

PyTorch GPU memory suddenly grows and shrinks

When I train my model with PyTorch 1.8.1, the GPU memory usage keeps growing and shrinking, which results in frequent out-of-memory errors. What is the usual reason for memory fluctuating like this?

I tried calling torch.cuda.empty_cache(), but the problem persists.

[Images: memory usage over time, my model, and the dataset's __getitem__ method]

Upvotes: -2

Views: 49

Answers (1)

Jiu_Zou

Reputation: 571

CUDA allocates and frees GPU memory throughout training, so the reported usage naturally rises and falls.
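A minimal sketch to watch this yourself, using the standard torch.cuda memory queries (the tensor size and print layout are just an example):

    import torch

    def report(tag):
        # memory_allocated: memory used by live tensors
        # memory_reserved: memory held by the caching allocator (can be larger)
        print(f"{tag}: allocated={torch.cuda.memory_allocated() / 1e6:.1f} MB, "
              f"reserved={torch.cuda.memory_reserved() / 1e6:.1f} MB")

    report("before")
    x = torch.randn(4096, 4096, device="cuda")  # allocate a large tensor
    report("after alloc")
    del x                                        # drop the reference
    report("after del")                          # allocated drops, reserved may not
    torch.cuda.empty_cache()                     # return cached blocks to the driver
    report("after empty_cache")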

Using a smaller batch_size will usually solve the problem. You should also check how you use tensors on the GPU: avoid creating very large tensors, or del a large CUDA tensor once you no longer need it. A rough sketch of that pattern is below.
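For example, a common cause of memory that grows across iterations is keeping large intermediate tensors (or the whole computation graph) alive longer than needed. A rough sketch of the training-loop pattern, where model, loader, criterion, and optimizer are placeholders for your own objects:

    import torch

    running_loss = 0.0
    for images, targets in loader:          # your DataLoader
        images = images.cuda(non_blocking=True)
        targets = targets.cuda(non_blocking=True)

        outputs = model(images)
        loss = criterion(outputs, targets)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        # .item() extracts a Python number; accumulating the loss tensor itself
        # would keep the whole graph (and its activations) alive across iterations
        running_loss += loss.item()

        # free large intermediates before the next iteration if memory is tight
        del images, targets, outputs, loss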

Upvotes: 1
