Reputation: 1008
The tf.equal() function seems to behave strangely when applied to a CNN. In the case below, tf.equal() returns an incorrect result.
with tf.Graph().as_default():
    images, labels = inputs("./test_data", [64, 64], 10, True)
    logits = inference(images, 2, 1.0)
    acc = accuracy(logits, labels)
    saver = tf.train.Saver()
    # predict_image(saver, logits)
    eval_once(saver, logits, acc, labels)
def eval_once(saver, logits, acc, labels):
    with tf.Session() as sess:
        ckpt = tf.train.get_checkpoint_state("./model/")
        if ckpt and ckpt.model_checkpoint_path:
            saver.restore(sess, ckpt.model_checkpoint_path)
            print "Model Loaded!"
        else:
            print "Model Not Found!"
            return
        coord = tf.train.Coordinator()
        threads = tf.train.start_queue_runners(sess, coord=coord)
        l = tf.argmax(labels, 1)
        p = tf.argmax(logits, 1)
        print "labels"
        print sess.run(l)
        print "preds"
        print sess.run(p)
        print sess.run(tf.equal(l, p))
        print "%.5f" % sess.run(acc)
        coord.request_stop()
        coord.join(threads, stop_grace_period_secs=10)
In the code, inputs reads the images through a file queue, and inference defines our CNN. The output of inference is the logits of the last fully connected layer. In my case the labels are one-hot encoded and there are 2 classes, so each label is either [1, 0] or [0, 1].
The result is below:
labels
[0 0 1 0 1 0 0 0 1 0]
preds
[0 1 0 1 0 0 1 0 1 1]
[ True True True True True True True True True True]
0.90000
From the result we can see that labels and preds differ at index 1 (counting from 0), yet tf.equal() gives us True. The same happens at indices 2 and 3.
I then tested tf.equal() on some other cases and the result was correct. So how can this happen?
(I use a file queue to read the images and apply tf.train.batch or tf.train.shuffle_batch to create the batches for training and testing, just like the CIFAR-10 example in the TensorFlow repository.)
Upvotes: 0
Views: 381
Reputation: 1008
After thinking about it, one likely explanation is that each call to sess.run() dequeues a fresh (shuffled) batch of images and labels from the file_queue. The three sess.run() calls above therefore evaluate l, p, and tf.equal(l, p) on three different batches, so the printed values do not correspond to one another. Fetching all tensors in a single call, e.g. sess.run([l, p, tf.equal(l, p)]), keeps them consistent because TensorFlow evaluates all fetches in one pass over the graph.
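To illustrate without TensorFlow, here is a minimal pure-Python sketch of the effect. The dequeue_batch helper is hypothetical; it just stands in for the shuffle queue, producing a new random batch on every call, the way each sess.run() triggers a new dequeue:

```python
import random

def dequeue_batch(rng, size=10):
    # Hypothetical stand-in for the file queue: every call returns a
    # fresh batch of (labels, preds), like tf.train.shuffle_batch.
    labels = [rng.randint(0, 1) for _ in range(size)]
    preds = [rng.randint(0, 1) for _ in range(size)]
    return labels, preds

rng = random.Random(0)

# Separate "sess.run" calls: labels and preds come from DIFFERENT
# batches, so comparing them elementwise is meaningless.
labels_a, _ = dequeue_batch(rng)
_, preds_b = dequeue_batch(rng)

# One combined "sess.run": labels, preds, and the equality mask all
# come from the SAME batch, so the mask matches what is printed.
labels_c, preds_c = dequeue_batch(rng)
mask = [l == p for l, p in zip(labels_c, preds_c)]
assert mask == [l == p for l, p in zip(labels_c, preds_c)]
```

The assert at the end only holds because all three values are derived from a single dequeue, which is exactly what sess.run([l, p, tf.equal(l, p)]) guarantees in the original code.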
Upvotes: 1