Zorglub29

Reputation: 8821

tensorflow restore only variables

I do some training in Tensorflow and save the whole session using a saver:

# ... define model

# add a saver
saver = tf.train.Saver()

# ... run a session
with tf.Session() as sess:
    # ...
    # save the model
    save_path = saver.save(sess, fileSaver)

It works fine, and I can successfully restore the whole session by using the exact same model and calling:

saver.restore(sess, importSaverPath)

Now I want to modify only the optimizer while keeping the rest of the model constant (the computation graph stays the same apart from the optimizer):

# the optimizer used before:
# optimizer = tf.train.AdamOptimizer(
#     learning_rate=learningRate).minimize(costPrediction)

# the new optimizer I want to use:
optimizer = tf.train.RMSPropOptimizer(
    learning_rate=learningRate, decay=0.9, momentum=0.1,
    epsilon=1e-5).minimize(costPrediction)

I also want to continue the training from the last graph state I saved (i.e., I want to restore the state of my variables and continue with another training algorithm). Of course I cannot use:

saver.restore

any longer, because the graph has changed.

So my question is: is there a way to restore only the variables (or even, perhaps for later use, only a subset of them) using saver.restore, when the whole session has been saved? I looked for such a feature in the API documentation and online, but could not find any example or explanation detailed enough to get it working.

Upvotes: 4

Views: 3209

Answers (3)

Frisa Shinezeus

Reputation: 39

I ran into the same problem. Inspired by keveman's answer, my solution is:

  1. Define your new graph (here, only the new optimizer-related variables differ from the old graph).

  2. Get all variables using tf.global_variables(). This returns a variable list I call g_vars.

  3. Get all optimizer-related variables using tf.contrib.framework.get_variables_by_suffix('some variable filter'). The filter may be RMSProp or RMSProp_*. This function returns a variable list I call exclude_vars.

  4. Get the variables that are in g_vars but not in exclude_vars. Simply use

    vars = [item for item in g_vars if item not in exclude_vars]

These vars are common to both the new and the old graph, and you can now restore them from the old model (see the sketch below).
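
A minimal sketch of these four steps, assuming a TF 1.x graph; the 'RMSProp' suffix and the checkpoint path are placeholders you should adapt to your own graph:

import tensorflow as tf

# 1. build the new graph here (same model, new RMSProp optimizer)
# ...

# 2. all variables in the new graph
g_vars = tf.global_variables()

# 3. variables created by the new optimizer; the suffix is an assumption,
#    inspect your graph to confirm the actual slot-variable names
exclude_vars = tf.contrib.framework.get_variables_by_suffix('RMSProp')

# 4. variables present in both the old and the new graph
restore_vars = [v for v in g_vars if v not in exclude_vars]

saver = tf.train.Saver(var_list=restore_vars)
with tf.Session() as sess:
    # initialize the optimizer variables the saver will not touch
    sess.run(tf.variables_initializer(exclude_vars))
    saver.restore(sess, 'path/to/old/checkpoint')  # placeholder path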

Upvotes: 1

Falcon

Reputation: 1367

You could recover the original Saver from the MetaGraph protobuf first, and then use that Saver to restore all the old variables safely. For a concrete example, take a look at the eval.py script in: TensorFlow: How do I release a model without source code?
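
A minimal sketch of that approach, assuming TF 1.x and that the old model was saved under the (hypothetical) prefix 'model', so the MetaGraph lives in 'model.meta':

import tensorflow as tf

# rebuild the saved graph from the MetaGraph protobuf; this also
# returns the Saver that was serialized along with it
saver = tf.train.import_meta_graph('model.meta')  # hypothetical path

with tf.Session() as sess:
    # restore every old variable exactly as it was saved
    saver.restore(sess, 'model')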

Upvotes: 0

keveman

Reputation: 8487

It is possible to restore a subset of variables by passing the list of variables as the var_list argument to the Saver constructor. However, when you change the optimizer, additional variables may have been created (momentum accumulators, for instance), and the variables associated with the previous optimizer, if any, will have been removed from the graph. So simply using the old Saver object to restore will not work, especially if you constructed it with the default constructor, which uses tf.all_variables as the var_list. You have to construct a Saver over just the subset of variables you created in your model; then restore will work. Note that this leaves the new variables created by the new optimizer uninitialized, so you have to initialize them explicitly.
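
A minimal sketch of that recipe, assuming TF 1.x; the variable shapes, names, and checkpoint path below are hypothetical stand-ins for your own model:

import tensorflow as tf

# keep handles to the variables you create in your model
W = tf.Variable(tf.zeros([784, 10]), name='W')
b = tf.Variable(tf.zeros([10]), name='b')
model_vars = [W, b]

# ... define costPrediction and the new RMSPropOptimizer here ...

# a Saver restricted to the model variables only
saver = tf.train.Saver(var_list=model_vars)

with tf.Session() as sess:
    # explicitly initialize everything the saver will not restore,
    # i.e. the slot variables created by the new optimizer
    new_vars = [v for v in tf.global_variables() if v not in model_vars]
    sess.run(tf.variables_initializer(new_vars))
    saver.restore(sess, 'path/to/old/checkpoint')  # placeholder path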

Upvotes: 3
