How to clean garbage from CUDA in Pytorch?

I trained my neural nets and realized that even after torch.cuda.empty_cache() and gc.collect(), my CUDA device memory is still filled. In Colab notebooks we can see the current variables in memory, but even after I delete every variable and run the garbage collector, the GPU memory stays busy. I heard it's because the Python garbage collector can't work on the CUDA device. Please explain what I should do.

Upvotes: 6

Views: 7107

Answers (2)

razimbres

Reputation: 5005

You can do this:

import gc
import torch

gc.collect()              # free unreferenced Python objects (and their tensors)
torch.cuda.empty_cache()  # release cached GPU memory back to the driver

Upvotes: 7

DvdG

Reputation: 794

For me, I have to delete the model before emptying the cache:

import gc
import torch

del model                 # drop the Python reference to the model
gc.collect()              # collect the now-unreferenced objects
torch.cuda.empty_cache()  # release the cached GPU memory

Then you can check that the memory has been freed using `nvidia-smi`.
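To verify this without an external tool, `torch.cuda.memory_allocated()` reports how much GPU memory PyTorch currently holds for live tensors. A minimal sketch of the delete-collect-empty pattern, assuming a CUDA device is available (the tensor name `x` is illustrative, standing in for your model or data):

```python
import gc
import torch

if torch.cuda.is_available():
    x = torch.zeros(1024, 1024, device="cuda")  # allocates GPU memory
    print(torch.cuda.memory_allocated())        # nonzero while x is referenced

    del x                     # drop the last Python reference
    gc.collect()              # collect any lingering reference cycles
    torch.cuda.empty_cache()  # return cached blocks to the driver
    print(torch.cuda.memory_allocated())        # should drop back toward zero
```

The key point is that `empty_cache()` can only release memory whose tensors are no longer referenced anywhere in Python, which is why the `del` must come first.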

Upvotes: 3
