Reputation: 103
I had saved a trained model in TensorFlow with:
saver = tf.train.Saver()
ss = saver.save(sess, '/tmp/new_trained_model.ckpt')
Then I loaded the model with:
imported_meta = tf.train.import_meta_graph("/tmp/new_trained_model.ckpt.meta")
imported_meta.restore(sess, tf.train.latest_checkpoint(checkpoint_dir="/tmp/", latest_filename="checkpoint"))
Now, to evaluate accuracy, the following function is used:
correct_prediction = tf.equal(tf.argmax(logits, 1), tf.argmax(one_hot_y, 1))
# logits come from the model; there is no error there, so I didn't post that code
accuracy_operation = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
# saver = tf.train.Saver()

def evaluate(X_data, y_data):
    num_examples = len(X_data)
    total_accuracy = 0
    sess = tf.get_default_session()
    for offset in range(0, num_examples, BATCH_SIZE):
        batch_x, batch_y = X_data[offset:offset+BATCH_SIZE], y_data[offset:offset+BATCH_SIZE]
        accuracy = sess.run(accuracy_operation, feed_dict={x: batch_x, y: batch_y, keep_prob: 1.0})
        total_accuracy += (accuracy * len(batch_x))
    return total_accuracy / num_examples

test_accuracy = evaluate(X_test, y_test)
But the above function gives the error:
FailedPreconditionError (see above for traceback): Attempting to use uninitialized value Variable_12
[[Node: Variable_12/read = Identity[T=DT_FLOAT, _class=["loc:@Variable_12"], _device="/job:localhost/replica:0/task:0/device:CPU:0"](Variable_12)]]
But when I print the tensors from the checkpoint, it shows the matrix of Variable_12:
from tensorflow.python.tools import inspect_checkpoint as chkp
chkp.print_tensors_in_checkpoint_file("/tmp/new_trained_model.ckpt", tensor_name='', all_tensor_names='', all_tensors=True)
Variable_12 (showing only one variable from the output):
tensor_name: Variable_12
[[-0.1013797 -0.08079438 -0.05904691 ... -0.07798752 -0.08208387
-0.18532619]
[ 0.10919656 -0.06162841 -0.19453178 ... -0.03241748 0.1023232
0.07120663]
[-0.10920436 0.00233169 -0.08879709 ... -0.09918057 -0.02546161
0.00903581]
...
[ 0.13858072 0.13791025 -0.12322884 ... -0.15006843 0.00103891
0.06663229]
[-0.14043045 0.14039241 0.15048873 ... 0.07272678 0.00470365
0.0273346 ]
[-0.10976157 -0.10873327 -0.16460624 ... -0.16509598 0.1124685
-0.08858881]]
Can anyone please explain why the uninitialized error is shown even though the value is clearly there in the checkpoint, as confirmed by inspect_checkpoint?
Thank you for your time.
Upvotes: 1
Views: 153
Reputation: 3643
The problem seems to be that you built two computation graphs.
First, you mentioned that you "executed the part where the model architecture was defined". This creates the first computation graph for your model.
Then, you also did
imported_meta = tf.train.import_meta_graph("/tmp/new_trained_model.ckpt.meta")
This creates the second computation graph for your model.
Depending on how exactly you executed these, the two computation graphs can live in one or in two separate Graph objects. In either case, imported_meta.restore initialized the variables of the imported (second) computation graph, but you called session.run() on a tensor from the first computation graph, whose variables nobody has initialized.
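If you want to confirm this, one quick diagnostic (a minimal sketch, assuming TensorFlow 1.x and that sess is the session you later pass to evaluate()) is to list the variable ops in the default graph and ask the session which variables it still considers uninitialized:

import tensorflow as tf

# List the variable ops that exist in the default graph. If both the model
# code and import_meta_graph were run against the same graph, you will
# usually see duplicated variable ops here.
for op in tf.get_default_graph().get_operations():
    if op.type in ('Variable', 'VariableV2'):
        print(op.name)

# Ask the session which variables have no value yet. The original graph's
# variables (e.g. Variable_12) should appear in this list, which is exactly
# what the FailedPreconditionError is complaining about.
print(sess.run(tf.report_uninitialized_variables()))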
The fix is not to import the (meta) graph if you have already built your graph. As long as the variable names and shapes have not changed, you can just use a Saver
to restore the values of the variables without creating any additional variables or operations, as in the sketch below.
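A minimal sketch of that fix, assuming the same graph-building code used for training (defining x, y, keep_prob, logits, one_hot_y and the evaluate() function above) has been run exactly once in this process:

saver = tf.train.Saver()  # matches variables in the checkpoint by name

with tf.Session() as sess:
    # Restore the trained values directly into the existing variables;
    # no import_meta_graph, so no duplicate variables or ops are created.
    saver.restore(sess, '/tmp/new_trained_model.ckpt')
    # evaluate() picks this session up via tf.get_default_session()
    test_accuracy = evaluate(X_test, y_test)
    print("Test accuracy: {:.3f}".format(test_accuracy))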
Upvotes: 2