Reputation: 172

Simultaneous training and testing in Tensorflow

I am trying to perform training and testing of a neural network in Tensorflow in the same script, same graph, same session. I read that it is possible, however when I look at the accuracy/loss results from the training and testing ops, it seems as if both ops are just a continuation of the training process somehow. E.g. training accuracy will end the epoch at 0.84, testing will then start at 0.84 and end at 0.87, and training will resume with an accuracy of 0.87...

My code is constructed like this:

1) defining calculations for acc and loss

calc_loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits)
acc, acc_op = tf.metrics.accuracy(labels=labels, predictions=predictions)
loss, loss_op = tf.metrics.mean(calc_loss)

2) running the above in one and the same tf.session, e.g. for acc:

acc_value, acc_op_value = sess.run([acc, acc_op], feed_dict=feed_dict_train)
test_acc_value, test_acc_op_value = sess.run([acc, acc_op], feed_dict=feed_dict_test)

the two dictionaries contain different data. My question is, do I need to define different ops for training and testing - is this where the training and testing get mixed up? Or is it impossible to test and train in one session? What would be a clean and simple way to go about this? A link to a code example that illustrates this would also be of help (as I am not managing to find anything that directly answers my question).

Upvotes: 3

Views: 377

Answers (1)

Alexandre Passos

Reputation: 5206

The metrics in tf.metrics are stateful; they create variables to accumulate partial results in, so you shouldn't expect them to auto-reset. Instead use the metrics in tf.contrib.metrics or tf.keras.metrics and session.run the ops to reset them accordingly.

Upvotes: 1
