Mehdi

Reputation: 1296

memory leakage in TensorFlow 2.2

I use TensorFlow 2.2 and I am trying to find the best hyperparameters for my model and dataset. To do this, I call model.fit inside a for loop where I change the learning rate. However, I have a memory leak. Here is my code:

from random import uniform

import tensorflow

for i in range(1, 500):
    LR = 10 ** uniform(-5, -3)
    model.compile(
        loss='mean_squared_error',
        optimizer=tensorflow.keras.optimizers.Adam(
            learning_rate=LR, beta_1=0.9, beta_2=0.999,
            epsilon=1e-07, amsgrad=False, name="Adam"),
        metrics=['mse'])
    model.fit(x_train, y_train, validation_data=(x_test, y_test),
              verbose=2, batch_size=32, epochs=30)

How can I fix this memory leak?

Upvotes: 1

Views: 651

Answers (1)

ravikt

Reputation: 1058

Since your compile and fit calls are inside a loop, it helps to start from a fresh TF graph and session at every iteration, clearing the clutter left behind by the models and layers of previous iterations. This can be done with tf.keras.backend.clear_session():

import tensorflow as tf
from random import uniform

for i in range(1, 500):
    LR = 10 ** uniform(-5, -3)
    model.compile(
        loss='mean_squared_error',
        optimizer=tf.keras.optimizers.Adam(
            learning_rate=LR, beta_1=0.9, beta_2=0.999,
            epsilon=1e-07, amsgrad=False, name="Adam"),
        metrics=['mse'])
    model.fit(x_train, y_train, validation_data=(x_test, y_test),
              verbose=2, batch_size=32, epochs=30)
    # Reset the global state Keras keeps between iterations
    tf.keras.backend.clear_session()
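A common variant of this pattern, which the Keras documentation for clear_session itself recommends when creating many models in a loop, is to clear the session first and rebuild the model inside the loop, so nothing still references the cleared state. A minimal sketch, where build_model and the tiny random dataset are hypothetical stand-ins for the asker's own network and data:

```python
from random import uniform

import numpy as np
import tensorflow as tf


def build_model():
    # Hypothetical stand-in for however you construct your network.
    return tf.keras.Sequential([
        tf.keras.Input(shape=(4,)),
        tf.keras.layers.Dense(1),
    ])


# Placeholder data; substitute your real x_train / y_train.
x_train = np.random.rand(32, 4).astype('float32')
y_train = np.random.rand(32, 1).astype('float32')

for i in range(3):  # 500 in the original search
    tf.keras.backend.clear_session()  # free state from the last iteration
    LR = 10 ** uniform(-5, -3)        # log-uniform sample in [1e-5, 1e-3]
    model = build_model()
    model.compile(loss='mean_squared_error',
                  optimizer=tf.keras.optimizers.Adam(learning_rate=LR))
    history = model.fit(x_train, y_train, verbose=0, batch_size=16, epochs=1)
```

Because the model is recreated each iteration, the old graph has no live references and can actually be reclaimed after clear_session.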

Alternatively, you can use model.train_on_batch inside the loop, which runs a single gradient update per call without the extra bookkeeping fit() sets up.
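A minimal sketch of that alternative, assuming a small placeholder model and random data (the model, shapes, and batch size here are illustrative, not the asker's):

```python
import numpy as np
import tensorflow as tf

# Placeholder model and data; substitute your own.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(loss='mean_squared_error',
              optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4))

x_train = np.random.rand(64, 4).astype('float32')
y_train = np.random.rand(64, 1).astype('float32')

batch_size = 32
for epoch in range(3):
    for start in range(0, len(x_train), batch_size):
        xb = x_train[start:start + batch_size]
        yb = y_train[start:start + batch_size]
        # Performs one gradient update and returns the scalar loss.
        loss = model.train_on_batch(xb, yb)
```

This gives you manual control of the training loop, at the cost of re-implementing shuffling, validation, and callbacks yourself.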

Upvotes: 1
