Reputation: 284
I am trying to fine-tune a spaCy NER model using BERT.
#Train the data
!python -m spacy train -g 0 config_spacy_bert.cfg --output ./output --paths.train ./train.spacy --paths.dev ./train.spacy
The batch size in the config file is 2, and I am getting this error:
RuntimeError: CUDA out of memory. Tried to allocate 18.00 MiB (GPU 0; 1.96 GiB total capacity; 958.13 MiB already allocated; 11.25 MiB free; 978.00 MiB reserved in total by PyTorch)
How can I fix this error?
Upvotes: 0
Views: 196
Reputation: 15593
You don't have enough GPU memory. You need to use a GPU with more memory, not use BERT at all, or use a smaller transformer model.
The recommended GPU memory size for training transformer pipelines with spaCy is 10GB; you can sometimes make do with 8GB or slightly less, but it looks like you only have 2GB, which is just not enough.
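If you do want to try squeezing the training onto a small GPU, the main knobs are the transformer checkpoint, the span windowing, and the batcher. Below is a hypothetical fragment of a `config_spacy_bert.cfg` illustrating lower-memory settings; the specific values (window/stride sizes, batch size, and the `distilbert-base-uncased` checkpoint) are assumptions you would need to tune, not values from your config:

```ini
[components.transformer.model]
@architectures = "spacy-transformers.TransformerModel.v3"
; Swap full BERT for a smaller distilled checkpoint (any
; Hugging Face model name can go here; distilbert is an example).
name = "distilbert-base-uncased"

[components.transformer.model.get_spans]
@span_getters = "spacy-transformers.strided_spans.v1"
; Shorter windows mean smaller activation tensors on the GPU.
window = 64
stride = 48

[training.batcher]
@batchers = "spacy.batch_by_padded.v1"
; Cap the padded size of each batch to bound peak memory,
; and drop examples too large to fit.
size = 500
buffer = 128
discard_oversize = true
get_length = null
```

Even with settings like these, 2GB leaves very little headroom once the model weights are loaded, so expect to iterate or fall back to a non-transformer pipeline (e.g. one based on `tok2vec`).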
Upvotes: 1