Leo

Reputation: 1139

TensorBoard - Plot loss from 2 networks trained simultaneously on the same graph

Is there a way to plot both training losses for two different networks being trained at the same time?

At the moment I use two FileWriter and save the summaries to two different directories:

writer_cnn  = tf.summary.FileWriter(os.path.join('log', 'cnn'))
writer_dann = tf.summary.FileWriter(os.path.join('log', 'dann'))
s_loss_cnn  = tf.summary.scalar('loss_class', loss_class_cnn)
s_loss_dann = tf.summary.scalar('loss_class', loss_class_dann)

And later in the code:

s_cnn  = sess.run(s_loss_cnn, feed_dict=feed_batch)
s_dann = sess.run(s_loss_dann, feed_dict=feed_batch)
writer_cnn.add_summary(s_cnn, global_step)
writer_dann.add_summary(s_dann, global_step)

But when I fire up TensorBoard, I get two separate plots, loss_class and loss_class_1. I've read in different places, like here and there, that creating two directories was the way to go. Am I missing something?

Upvotes: 0

Views: 605

Answers (1)

Salvador Dali

Reputation: 222561

I suspect your problem is that you add all the operations to the same graph (the default graph).

Try creating a separate graph for each network and passing each one to its writer (the graph parameter).

Something like this:

def graph1():
    g1 = tf.Graph()
    with g1.as_default() as g:
        pass  # define your ops here
    with tf.Session(graph=g) as sess:
        pass  # run training and write summaries with the writer

Create a similar function graph2() and then invoke them.
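A minimal, self-contained sketch of this idea (TF1-style API via tf.compat.v1 so it also runs under TF2; the constant losses are stand-ins for your real loss ops). The key point is that each network gets its own graph, its own session, and its own log directory, while both summaries keep the same tag 'loss_class', so TensorBoard overlays them as two runs on one chart:

```python
import os
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

def train_network(log_dir, loss_value):
    # Each network lives in its own graph, so summary tags don't collide
    # and TensorBoard never appends a '_1' suffix.
    g = tf.Graph()
    with g.as_default():
        loss = tf.constant(loss_value)  # stand-in for the real loss op
        summary_op = tf.summary.scalar('loss_class', loss)  # same tag in both runs
        writer = tf.summary.FileWriter(log_dir, graph=g)
        with tf.Session(graph=g) as sess:
            s = sess.run(summary_op)
            writer.add_summary(s, global_step=0)
        writer.close()

train_network(os.path.join('log', 'cnn'), 0.5)
train_network(os.path.join('log', 'dann'), 0.7)
```

Pointing TensorBoard at the parent directory (tensorboard --logdir log) then shows 'cnn' and 'dann' as two runs plotted on the same loss_class chart.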

Upvotes: 0
