Reputation: 31
I am trying to train a model on around 4,500 text sentences. The embedding step, however, is memory-heavy: the session crashes whenever the number of training sentences exceeds 350. It works fine and displays results up to 350 sentences.
Error message:
Your session crashed after using all available RAM.
Runtime type - GPU
I have attached a screenshot of the session logs. I am considering training the model in batches, but I am a newbie and finding it difficult to work my way around it. Any help will be appreciated.
Upvotes: 3
Views: 3873
Reputation:
This is basically an out-of-memory error on Google Colab. Colab provides ~12 GB of free RAM; to extend it to 25 GB, follow the instructions mentioned here.
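If upgrading the RAM is not an option, computing the embeddings in batches usually avoids the crash, since only one small chunk is in memory at a time. Below is a minimal sketch of that pattern; `embed_batch` is a hypothetical stand-in for whatever embedding call you actually use (for example, `model.encode` in sentence-transformers):

```python
import numpy as np

def embed_batch(batch):
    # Hypothetical placeholder: replace with your real embedding call.
    # Here it just returns a dummy 8-dimensional vector per sentence.
    return np.zeros((len(batch), 8))

def embed_in_batches(sentences, batch_size=256):
    """Embed sentences in chunks of `batch_size` and stack the results."""
    chunks = []
    for start in range(0, len(sentences), batch_size):
        batch = sentences[start:start + batch_size]
        chunks.append(embed_batch(batch))
    return np.vstack(chunks)

sentences = [f"sentence {i}" for i in range(4500)]
embeddings = embed_in_batches(sentences, batch_size=256)
print(embeddings.shape)  # (4500, 8)
```

Smaller batch sizes trade speed for a lower peak-memory footprint, so if a batch of 256 still crashes, try 64 or 32.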
Upvotes: 2