user3496060

Reputation: 846

How do I get the VALUES of trainable variables from a restored graph & checkpoint in TensorFlow

I want to get the values of the variables from a trained model. I have a check point file and I can restore graphs and checkpoints and do inference with them just fine.

However, I'm finding it extremely difficult to figure out how to get the trainable variable VALUES (the weight and bias values, not their names) after I restore the checkpoint and graph. I've read through the TensorFlow documentation, and there are lots of suggestions involving "with variable_scope", "reuse=True", and "tf.get_variable("myvar")" within the scope, etc., but I get errors stating either that the variable already exists or that it hasn't been initialized. tf.GraphKeys only gives me names, not values.

Upvotes: 0

Views: 2536

Answers (1)

dm0_

Reputation: 2156

When loading a meta-graph, TensorFlow also restores collections. Several collections relate to variables; for example, you can get the collection of all trainable variables this way:

# graph is a TensorFlow Graph
variables = graph.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES)

You can Session.run each variable in the returned list to get its value. The code below assumes that variables are already initialized (restored from a checkpoint):

# sess is a TensorFlow Session
values = [sess.run(v) for v in variables]
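Putting the two snippets together, here is a self-contained sketch of the full flow, assuming the TensorFlow 1.x graph/session API (the `tensorflow.compat.v1` import makes it also run under TensorFlow 2.x). The toy model, its variable names (`w`, `b`), and the temporary checkpoint path are all illustrative placeholders, not part of your model:

```python
import os
import tempfile

import numpy as np

# Use the v1 API so this also runs under TensorFlow 2.x
try:
    import tensorflow.compat.v1 as tf
    tf.disable_eager_execution()
except ImportError:
    import tensorflow as tf

ckpt_prefix = os.path.join(tempfile.mkdtemp(), "model.ckpt")

# --- Training side: create and save a toy model ---
with tf.Graph().as_default():
    w = tf.get_variable("w", initializer=tf.constant([[1.0, 2.0]]))
    b = tf.get_variable("b", initializer=tf.constant([0.5]))
    saver = tf.train.Saver()
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        saver.save(sess, ckpt_prefix)  # writes model.ckpt.* and model.ckpt.meta

# --- Inference side: restore graph + checkpoint, then read the values ---
with tf.Graph().as_default() as graph:
    saver = tf.train.import_meta_graph(ckpt_prefix + ".meta")  # rebuilds the graph
    with tf.Session() as sess:
        saver.restore(sess, ckpt_prefix)  # variables now hold the saved values
        variables = graph.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES)
        # Evaluate every trainable variable to get its value as a NumPy array
        values = {v.name: sess.run(v) for v in variables}

for name, value in values.items():
    print(name, value)
```

Keying the results by `v.name` is handy when you want to look up a specific weight or bias tensor rather than iterate over all of them.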

Upvotes: 2
