Khizar

Reputation: 45

Keras: What happens if I recompile my model after training for a few epochs?

I have a model and I want to train it with learning_rate = 0.8 for a few epochs, then set the learning rate to 0.4 and continue training. But the learning rate is set when the model is compiled, so what will happen to the model/weights if I recompile it after a few epochs?

Below is my code. P.S.: my learning rate is dynamic.

from tensorflow.keras.optimizers import Adam

lr = 0.04
adam = Adam(lr=lr)
weight_factor = 10

# loss / metric identifiers for the two outputs
kl_divergence = "kullback_leibler_divergence"
mae = "mean_absolute_error"

models.compile(
    optimizer=adam,
    loss={'W1': kl_divergence, 'age': mae},
    metrics={'age': mae, 'W1': 'accuracy'},
    loss_weights={'W1': weight_factor, 'age': 1}
)

Dynamic learning rate callback

from tensorflow.keras.callbacks import ReduceLROnPlateau

callbacks = [
    ReduceLROnPlateau(monitor='val_age_mean_absolute_error',
                      factor=0.5,
                      patience=7,
                      min_delta=0.01,
                      cooldown=2,
                      min_lr=0.0001,
                      mode='min')
]

Training

epochs = 35
history = models.fit(train_gen, steps_per_epoch=len(trainset) / batch_size,
                     epochs=epochs, callbacks=callbacks,
                     validation_data=validation_gen,
                     validation_steps=len(testset) / batch_size * 3)
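To make the intent concrete, what I am considering is something like the sketch below (illustration only, not my real training loop: steps and epochs=5 are placeholders, and the loss setup is the same as above):

# phase 1: train a few epochs with the first learning rate
models.compile(optimizer=Adam(lr=0.8),
               loss={'W1': kl_divergence, 'age': mae},
               loss_weights={'W1': weight_factor, 'age': 1})
models.fit(train_gen, steps_per_epoch=steps, epochs=5)

# recompile with the lower learning rate: does this reset the trained weights?
models.compile(optimizer=Adam(lr=0.4),
               loss={'W1': kl_divergence, 'age': mae},
               loss_weights={'W1': weight_factor, 'age': 1})
models.fit(train_gen, steps_per_epoch=steps, epochs=5)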

Upvotes: 0

Views: 1412

Answers (2)

Eddy-Python

Reputation: 77

You can recompile a model without losing the weights. faheem's answer is wrong.

Recompiling a model is commonly done in transfer learning and fine-tuning, and it is described in the official TensorFlow documentation:

https://www.tensorflow.org/guide/keras/transfer_learning

Basically, transfer learning would not work at all if recompiling the model removed the weights.
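You can check this yourself with a small, self-contained sketch (a toy model, not your actual network; the names here are just for the demonstration):

import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(4, input_shape=(3,))])
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.8), loss="mae")
before = model.get_weights()

# recompile with a different learning rate
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.4), loss="mae")
after = model.get_weights()

# the layer weights are identical before and after recompiling
print(all(np.array_equal(b, a) for b, a in zip(before, after)))  # prints True

Note that passing a fresh optimizer to compile() does reset the optimizer's internal state (for example Adam's moment estimates), but the layer weights themselves are left untouched.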

Upvotes: 0

faheem

Reputation: 634

When you recompile the model, your weights are reset to random values.

So you should save the weights with model.save_weights('weights.h5'), then recompile the model, and afterwards load the weights back with model.load_weights('weights.h5').
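The full pattern would look roughly like this (a sketch; model, the loss setup and the file name are placeholders for your own):

from tensorflow.keras.optimizers import Adam

# save the current weights before recompiling
model.save_weights('weights.h5')

# recompile with the new learning rate
model.compile(optimizer=Adam(lr=0.4), loss='mae')

# load the saved weights back into the recompiled model
model.load_weights('weights.h5')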

Upvotes: 3
