sergulaydore

Reputation: 307

How to interpret tensorboard graph in tensorflow?

I am trying to understand how tensorboard visualizes the graph. I am using a simple linear regression for this purpose. Here is my code:

# LINEAR REGRESSION IN TENSORFLOW

# generate points
import numpy as np 
import os
import time
import tensorflow as tf

num_points = 1000
vectors_set = []
for i in range(num_points):
    x1 = np.random.normal(0.0, 0.55)
    y1 = x1 * 0.1 + 0.3 + np.random.normal(0.0, 0.03)
    vectors_set.append([x1, y1])

with tf.name_scope('data') as scope:
    x_data = [v[0] for v in vectors_set]
    y_data = [v[1] for v in vectors_set]

# Cost function and gradient descent algorithm
with tf.name_scope('model') as scope:
    W = tf.Variable(tf.random_uniform([1], -1, 1), name = "W")
    b = tf.Variable(tf.zeros([1]), name = "b")
    z = tf.add(W * x_data, b, name = "z")

with tf.name_scope('loss') as scope:
    loss = tf.reduce_mean(tf.square(z - y_data))

optimizer = tf.train.GradientDescentOptimizer(0.5)
train = optimizer.minimize(loss)

# Running the algorithm
init = tf.initialize_all_variables()

sess = tf.Session()
sess.run(init)

timestamp = str(int(time.time()))
print(timestamp)
train_summary_writer = tf.train.SummaryWriter(
      os.path.join(
          "./", "summaries", timestamp), sess.graph)
train_summary_writer.add_graph(sess.graph)

Here is the TensorBoard visualization: [screenshot of the graph]

My questions are:

  1. I did not define gradients in my graph. Does TensorBoard add them by default?
  2. Why are there 8 tensors going from loss to gradients, and 5 tensors from model to gradients?
  3. I did not define a variable y. Does TensorBoard automatically assign y to constants? How can I change it?
  4. Why doesn't my graph show arrows between ops?

Thank you very much!

Upvotes: 2

Views: 4087

Answers (1)

dandelion

Reputation: 1782

  1. The gradients were added to your graph automatically when you created the tf.train.GradientDescentOptimizer and called minimize(loss); minimize is what inserts the gradient ops.

  2. Your code asks the GradientDescentOptimizer to minimize loss, so the gradient ops depend on the tensors produced inside loss. And to minimize the loss, the optimizer also needs the intermediate tensors and variables from model in order to update the weights. Each of those dependencies shows up as an edge into gradients, which is why you see several tensors flowing in.

  3. I'm not sure about that; can you upload the graph definition? (You can get the graph def from the session.)

  4. We disabled the arrows when we added the tensor shapes, but a lot of people have asked for them, so we will put them back.

BTW, right now you are inlining your data into the model, which isn't a good pattern. In the block:

with tf.name_scope('data') as scope:
  x_data = [v[0] for v in vectors_set]
  y_data = [v[1] for v in vectors_set]

The name_scope is not doing anything, since you are not creating any TensorFlow ops there, just declaring Python lists. Instead, you should consider using placeholders.

Upvotes: 4
