Yanghoon

Reputation: 572

Is there any way to forcibly add new variables to an existing checkpoint in TensorFlow?

Assume there is a deep learning model and a checkpoint with pre-trained weights. What I want to do is fine-tune the pre-trained model. However, I found that I have to fine-tune the model with a new Adam optimizer that is not defined in the existing model graph. Since the Adam optimizer itself has some trainable_variables, I have to add those to the existing checkpoint if I want to fine-tune the model based on that checkpoint. Is there any way to forcibly add new variables to an existing checkpoint in TensorFlow?

Upvotes: 0

Views: 153

Answers (1)

Sharvil Nanavati

Reputation: 46

You can use tf.compat.v1.train.warm_start to fine-tune your new model with weights from a previously-trained model.

Normally you'd restore variables using tf.train.Saver or tf.train.Checkpoint. If you use tf.compat.v1.train.warm_start, it'll restore the variables that your new model has in common with the old model but it'll leave it up to you to initialize any new variables you've added.
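For illustration, here is a minimal sketch of that pattern. The model, variable names, and checkpoint path are placeholders and would need to match your own graph and checkpoint; by default warm_start only matches variables in the TRAINABLE_VARIABLES collection, so the new Adam slot variables are left to the regular initializer.

    import tensorflow as tf

    tf.compat.v1.disable_eager_execution()

    # Build the new fine-tuning graph. Names here are hypothetical; they must
    # match the variable names stored in the old checkpoint.
    x = tf.compat.v1.placeholder(tf.float32, [None, 10], name='x')
    y = tf.compat.v1.placeholder(tf.float32, [None, 1], name='y')
    logits = tf.compat.v1.layers.dense(x, 1, name='dense')
    loss = tf.reduce_mean(tf.square(logits - y))

    # The new Adam optimizer adds slot variables (".../Adam", ".../Adam_1",
    # beta power accumulators) that do not exist in the old checkpoint.
    train_op = tf.compat.v1.train.AdamOptimizer(1e-4).minimize(loss)

    # Warm-start: rewires the initializers of variables that also exist in the
    # old checkpoint so they load the pre-trained values. The Adam slots are
    # not matched and keep their fresh initializers.
    tf.compat.v1.train.warm_start(
        ckpt_to_initialize_from='/path/to/old_checkpoint',  # hypothetical path
        vars_to_warm_start='.*')

    with tf.compat.v1.Session() as sess:
        # One call both restores the warm-started weights and freshly
        # initializes the new Adam variables.
        sess.run(tf.compat.v1.global_variables_initializer())
        # ... fine-tune with sess.run(train_op, feed_dict=...) ...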

Upvotes: 0
