Reputation: 2917
I'm using Google Colab to train my model. After training, I want to change the model, but I can't because there is not enough RAM for it. I tried re-assigning the old model to None,
but the RAM usage didn't decrease.
I don't want to close the session and start from the beginning. Is there any way to free up the RAM used in Google Colab?
Upvotes: 4
Views: 22292
Reputation: 2014
I had this problem. I was looping through different models I was building, and it helped me to clear the session from memory after each run, as suggested in this other Stack Overflow contribution:
from tensorflow.keras import backend as K
K.clear_session()
For some other users this also helped:
import tensorflow as tf
tf.reset_default_graph()   # TensorFlow 1.x API; in TF 2.x use tf.compat.v1.reset_default_graph()
It might also be, without you noticing, that your RAM gets exhausted because you load your data from a pandas DataFrame. In that case this might help you too; more precisely, adding the following lines at the end of each loop iteration cleared the memory in my case:
import gc
import pandas as pd

del df                 # drop the reference to the old DataFrame
gc.collect()           # force garbage collection to release the memory
df = pd.DataFrame()    # start the next iteration with a fresh, empty DataFrame
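To put the pieces together, here is a minimal sketch of how this can look inside a model-building loop; the tiny network, the synthetic data, and the units sweep are just placeholders, not anything from the question:

import gc
import numpy as np
from tensorflow import keras
from tensorflow.keras import backend as K

# Tiny synthetic dataset, only to make the sketch runnable.
x_train = np.random.rand(256, 10).astype("float32")
y_train = np.random.randint(0, 2, size=(256, 1)).astype("float32")

for units in (32, 64, 128):  # hypothetical hyperparameter sweep
    model = keras.Sequential([
        keras.Input(shape=(10,)),
        keras.layers.Dense(units, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")
    model.fit(x_train, y_train, epochs=1, verbose=0)

    # Drop the Python reference, reset the Keras backend state,
    # and force garbage collection before the next iteration.
    del model
    K.clear_session()
    gc.collect()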
Upvotes: 5
Reputation: 21
Colab does not currently provide a feature to increase RAM.
A workaround you can opt for is to del variables as soon as you are done with them. Secondly, try to dump your intermediate results to disk using the pickle or joblib libraries, so that if the RAM crashes you don't have to start all over again.
For example:
import joblib                    # "from sklearn.externals import joblib" is deprecated/removed in newer scikit-learn
from google.colab import files

# Save the variable to a file in the Colab working directory.
joblib.dump(var, 'var.pkl')
# Download the file to your local Downloads folder.
files.download('var.pkl')
# Reload the saved data later.
var = joblib.load('var.pkl')
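If you don't want to depend on joblib, the same checkpointing idea works with the standard-library pickle module; this is just a rough sketch, and the results dictionary is a placeholder:

import pickle

results = {"epoch": 10, "val_accuracy": 0.93}   # placeholder intermediate result

# Save it to a file in the Colab working directory.
with open('results.pkl', 'wb') as f:
    pickle.dump(results, f)

# ...later, reload it instead of recomputing from scratch.
with open('results.pkl', 'rb') as f:
    results = pickle.load(f)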
Upvotes: 2
Reputation: 27
As a workaround to increase your RAM to 25 GB, you can run the code below and wait for the runtime to exhaust its memory, at which point the notebook pops up the option to increase the RAM. There you go: your RAM is increased to 25 GB.
d = []
while(1):
    d.append('1')
Upvotes: -2
Reputation: 31
Colab doesn't support this feature. The only option is to start all over again.
Upvotes: 0