deepdebugging

Reputation: 115

Tensorflow: Not restoring but only saving trainable variables

Let's say I have a model with Y layers.

I am trying to restore the model with the first Y-1 layers set to trainable=False, so I pass those Y-1 layers' variables as var_list when defining tf.train.Saver(var_list=list_of_Y-1_layers) so that they can be restored.

I do not want to restore the last layer, since I want to train it myself. But if I put it in var_list it gets restored, and if I leave it out, it does not get saved to the checkpoint during training either.

Does this variable get saved elsewhere? Or am I doing something wrong when saving/restoring?

Side note: to check whether a trainable variable was saved, I use the inspect_checkpoint tool defined in tensorflow/tensorflow/python/tools/inspect_checkpoint.py
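For reference, a checkpoint's contents can also be listed programmatically with the public `tf.train.list_variables` helper; here is a self-contained sketch (the variable names `layer1/w` and `layer2/w` are placeholders, and the TF 1.x graph-mode API is reached via `tf.compat.v1`):

```python
import os
import tempfile

import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()
tf1.reset_default_graph()

# Two variables standing in for two layers.
w1 = tf1.get_variable("layer1/w", shape=[2, 2])
w2 = tf1.get_variable("layer2/w", shape=[2, 2])

saver = tf1.train.Saver()  # with no var_list, saves every variable
ckpt = os.path.join(tempfile.mkdtemp(), "model.ckpt")

with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    saver.save(sess, ckpt)

# List every tensor stored in the checkpoint, with its shape.
for name, shape in tf.train.list_variables(ckpt):
    print(name, shape)
```

Any variable missing from this listing was not written to the checkpoint.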

Upvotes: 0

Views: 761

Answers (2)

Philippe C

Reputation: 677

You can create two objects, one for saving, the other for restoring:

# Used to restore only the first Y-1 layers:
saver_restore = tf.train.Saver(var_list=list_of_Y-1_layers)
# Used to save; with no var_list it saves all variables:
saver_save = tf.train.Saver()
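A runnable sketch of this two-Saver pattern (the layer names, checkpoint paths, and the fabricated "pretrained" checkpoint are all placeholders, and the TF 1.x API is reached via `tf.compat.v1`):

```python
import os
import tempfile

import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()
ckpt_dir = tempfile.mkdtemp()
pre_ckpt = os.path.join(ckpt_dir, "pretrained.ckpt")

# Fabricate a "pretrained" checkpoint so the sketch is self-contained.
tf1.reset_default_graph()
w1 = tf1.get_variable("layer1/w", initializer=[1.0, 2.0])
with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    tf1.train.Saver().save(sess, pre_ckpt)

# The pattern from the answer: one Saver restores a subset, one saves all.
tf1.reset_default_graph()
w1 = tf1.get_variable("layer1/w", shape=[2], trainable=False)  # restored
last = tf1.get_variable("last/w", initializer=[0.0, 0.0])      # trained fresh

saver_restore = tf1.train.Saver(var_list=[w1])  # restores only layer1
saver_save = tf1.train.Saver()                  # saves all variables

new_ckpt = os.path.join(ckpt_dir, "finetuned.ckpt")
with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())  # init everything first
    saver_restore.restore(sess, pre_ckpt)         # then overwrite layer1
    saver_save.save(sess, new_ckpt)               # includes last/w too
```

Because `saver_save` was built without a `var_list`, the new checkpoint contains the last layer as well, which is exactly what the question was missing.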

Upvotes: 1

suharshs

Reputation: 1088

You can save your entire model without specifying a var_list. This will save all of the variables in a checkpoint. Then when you restore, you can specify var_list to the restore Saver to only restore your desired subset of layers.
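A sketch of the restore side of this approach (variable names and paths are placeholders; TF 1.x API via `tf.compat.v1`): the full checkpoint is written without a `var_list`, and the partial restore leaves the last layer at its fresh initialization.

```python
import os
import tempfile

import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()
ckpt = os.path.join(tempfile.mkdtemp(), "full.ckpt")

# Save the entire model: no var_list, so every variable is written.
tf1.reset_default_graph()
w = tf1.get_variable("layer1/w", initializer=[1.0, 2.0])
last = tf1.get_variable("last/w", initializer=[3.0, 4.0])
with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    tf1.train.Saver().save(sess, ckpt)

# Restore only the desired subset; "last/w" keeps its fresh init.
tf1.reset_default_graph()
w = tf1.get_variable("layer1/w", shape=[2])
last = tf1.get_variable("last/w", initializer=[0.0, 0.0])
with tf1.Session() as sess:
    sess.run(tf1.global_variables_initializer())
    tf1.train.Saver(var_list=[w]).restore(sess, ckpt)
    restored, fresh = sess.run([w, last])

print(restored)  # taken from the checkpoint: [1. 2.]
print(fresh)     # untouched by the partial restore: [0. 0.]
```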

Sources:

https://www.tensorflow.org/programmers_guide/saved_model#choosing_which_variables_to_save_and_restore

Upvotes: 0
