How to restore variables of a particular scope from a saved checkpoint in tensorflow?

import tensorflow as tf
saver = tf.train.Saver()   # with no var_list, this saver covers every variable in the graph
saver.restore(...)

But a saver built this way restores the entire graph. I would like to restore only those variables that are in a specific scope.

Thanks in advance!

Upvotes: 3

Views: 7164

Answers (1)

user2781994

Reputation: 469

Assume you have Google's InceptionNet model in the scope InceptionV1 and you want to load all of it except for the last layer, which lives in the scope InceptionRetrained and which you want to retrain.
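To make that scope layout concrete, here is a minimal sketch of such a two-scope graph; the variable names and shapes are placeholders for illustration, not the real Inception definition:

import tensorflow as tf

# Pretrained part of the network lives under the 'InceptionV1' scope.
with tf.variable_scope('InceptionV1'):
    features = tf.get_variable('features', shape=[1024])   # stand-in for the Inception weights

# The layer to be retrained lives under its own 'InceptionRetrained' scope.
with tf.variable_scope('InceptionRetrained'):
    weights = tf.get_variable('weights', shape=[1024, 10])
    biases = tf.get_variable('biases', shape=[10])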

Assuming you have already started retraining the last layer and created the last_layer.ckpt file with saver2.save(session, 'last_layer.ckpt'), here is how to restore the net from both checkpoints.

# saver1 only knows about the variables created under the 'InceptionV1' scope,
# so restoring it leaves everything else in the graph untouched.
saver1 = tf.train.Saver(var_list=tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope='InceptionV1'))
saver1.restore(session, 'inception_model_from_google.ckpt')

# The retrained last layer is restored from its own checkpoint in the same way.
saver2 = tf.train.Saver(var_list=tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope='InceptionRetrained'))
saver2.restore(session, 'last_layer.ckpt')
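For reference, the last_layer.ckpt mentioned above is the file written earlier with the same scoped saver; a sketch, assuming the session and saver2 from the snippet above:

# Writes only the 'InceptionRetrained' variables, since that is all saver2 knows about.
saver2.save(session, 'last_layer.ckpt')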

If you are retraining only the last layer, don't forget to stop gradients from propagating back through the frozen part of the network (which saves time) by passing a var_list argument to the optimizer's minimize() call.

# Use a concrete optimizer (tf.train.Optimizer itself is abstract) and make
# sure the scope name matches exactly: 'InceptionRetrained'.
train_op = tf.train.GradientDescentOptimizer(0.0001).minimize(
    loss, var_list=tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope='InceptionRetrained'))
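To see the effect of var_list in isolation, here is a small self-contained example with made-up scope names ('Frozen' and 'Retrained') showing that only the variables of the selected scope get updated:

import tensorflow as tf

# Toy graph: one variable per scope, a loss that depends on both.
with tf.variable_scope('Frozen'):
    a = tf.get_variable('a', initializer=1.0)
with tf.variable_scope('Retrained'):
    b = tf.get_variable('b', initializer=1.0)

loss = tf.square(a + b)

# Gradients are computed and applied only for the 'Retrained' scope.
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(
    loss, var_list=tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, scope='Retrained'))

with tf.Session() as session:
    session.run(tf.global_variables_initializer())
    session.run(train_op)
    print(session.run([a, b]))   # a stays at 1.0; only b was updated (to 0.6)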

Upvotes: 4
