Leander

Reputation: 1395

Summary for a specific branch

I have a TensorFlow graph with a complicated loss function for training, but a simpler one for evaluation (they share ancestors). Essentially:

train_op = ... (needs more things in feed_dict etc.)
acc = ... (just needs one value for its placeholder)

To better understand what's going on, I added summaries. But after calling

merged = tf.summary.merge_all()

and then

(summ, acc) = session.run([merged, acc_eval], feed_dict={..})

TensorFlow complains that values for placeholders are missing.

Upvotes: 0

Views: 53

Answers (1)

Lam

Reputation: 340

As far as I understand your question: to summarize a specific TensorFlow operation, you should run that summary op explicitly.

For example:

# define accuracy ops
correct_prediction = tf.equal(tf.argmax(Y, axis=1), tf.argmax(Y_labels, axis=1))  
accuracy = tf.reduce_mean(tf.cast(correct_prediction, dtype=tf.float32))  

# summary_accuracy is the summary op to run instead of merge_all()
# when you only want to summarize specific ops
summary_accuracy = tf.summary.scalar('testing_accuracy', accuracy)

# initialize variables and create a summary writer for evaluation
sess.run(tf.global_variables_initializer())
test_writer = tf.summary.FileWriter('log/test', sess.graph)

(summ, acc) = sess.run([summary_accuracy, accuracy], feed_dict={..})
test_writer.add_summary(summ)

Also, you can use tf.summary.merge() to merge only the summaries you choose; it is documented here.
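For example, here is a minimal, self-contained sketch of that approach (the placeholder and op names are made up for illustration; it assumes TF 1.x-style graph mode, written with tf.compat.v1 so it also runs under TF 2.x). Because merge() takes an explicit list, the training-only summary and its placeholder are never pulled into the run:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style graph mode
tf.disable_eager_execution()

# Two placeholders: one needed for evaluation, one only for training.
x = tf.placeholder(tf.float32, name='x')
train_only = tf.placeholder(tf.float32, name='train_only')

accuracy = x * 2.0                     # stand-in for a real accuracy op
loss = train_only + 1.0                # stand-in for the training loss

acc_summary = tf.summary.scalar('eval_accuracy', accuracy)
tf.summary.scalar('train_loss', loss)  # would be picked up by merge_all()

# merge() only collects the summaries you pass in, so the training
# summary (and its placeholder) is left out entirely.
eval_merged = tf.summary.merge([acc_summary])

with tf.Session() as sess:
    summ, acc = sess.run([eval_merged, accuracy], feed_dict={x: 1.5})
    print(acc)  # 3.0
```

Running merge_all() here would fail with the same missing-placeholder error as in the question, since it would also try to evaluate train_loss.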
Hope this helps!

Upvotes: 1
