YON

Reputation: 1737

Tensorboard v1.0 - Histogram tab interpretation

I am learning to visualize tensors with TensorBoard, but I don't know how to interpret the chart in the Histogram tab. I used the code below:

import tensorflow as tf

sess = tf.Session()
tf.summary.histogram('test', tf.constant([1, 1, 2, 2, 3, 4, 4, 4, 4]))
summary = tf.summary.merge_all()

train_writer = tf.summary.FileWriter('../tmp/train', sess.graph)
for i in range(10):
    summ = sess.run(summary)           # evaluate the merged summary op
    train_writer.add_summary(summ, i)  # write it for step i
train_writer.close()                   # flush events to disk

I got this chart from tensorboard:

Histogram mode: offset [screenshot]

Histogram mode: overlay [screenshot]

I know the x-axis is the value and the y-axis is the time step; what I don't know is the meaning of the z-axis. According to this issue,

It's a normalized density. I wouldn't describe it as a probability density, although I think calling it one would be justifiable.

Can anyone explain more (i.e. how the density is calculated)?
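
In case it clarifies what I am asking: my rough guess (purely an assumption on my part, not necessarily what TensorBoard actually does) is that the count in each bucket is scaled so that the total area of the bars is 1, e.g.:

import numpy as np

values = np.array([1, 1, 2, 2, 3, 4, 4, 4, 4])

# One plausible "normalized density": count / (total_count * bucket_width),
# so the bars integrate to 1 (same as np.histogram(..., density=True)).
counts, edges = np.histogram(values, bins=4)
widths = np.diff(edges)
density = counts / (counts.sum() * widths)

print(density)                    # bar heights
print((density * widths).sum())   # 1.0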

Upvotes: 1

Views: 1178

Answers (1)

Salvador Dali

Reputation: 222511

The plot shows approximately what it should: spikes at the values 1, 2, 3, 4, with the biggest spike at the value 4 and the smallest at 3. The result looks strange because the tensor you selected is hard to visualize as a distribution (in the same way that it would not be impressive to look at a circle in a 3-D program).
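
Just to double-check the expected bar heights, you can count the values directly (plain NumPy, outside TensorFlow):

import numpy as np

values, counts = np.unique([1, 1, 2, 2, 3, 4, 4, 4, 4], return_counts=True)
print(dict(zip(values.tolist(), counts.tolist())))  # {1: 2, 2: 2, 3: 1, 4: 4}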

Plot the actual distribution and it will be easier to understand. Here is an example:

import tensorflow as tf
import numpy as np

# 100,000 samples from a normal distribution with mean 5 and std 3
v = np.random.normal(loc=5, scale=3.0, size=100000)
a = tf.constant(v)
s = tf.summary.histogram('normal', a)

merged = tf.summary.merge_all()
with tf.Session() as sess:
    writer = tf.summary.FileWriter('logs', sess.graph)
    for i in range(10):                  # one histogram per step
        summary = sess.run(merged)
        writer.add_summary(summary, i)

    writer.close()

Here you see the normal distribution with mean 5 and std 3: [screenshot]

The 10 stacked histograms, for steps 0 to 9, come from your loop. They are all identical because nothing changes the value of a between steps. In real work you will see the histograms of your tensors evolve after each training step.
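
If you want to see the stacked histograms actually change between steps, here is a minimal sketch (still TF 1.x; the variable and log directory names are just placeholders I picked) where the tensor is shifted a bit every iteration:

import tensorflow as tf
import numpy as np

# A variable whose distribution drifts a little every step,
# so each of the 10 stacked histograms looks different.
v = tf.Variable(np.random.normal(loc=0.0, scale=1.0, size=10000))
shift = tf.assign_add(
    v, tf.random_normal([10000], mean=0.5, stddev=0.1, dtype=tf.float64))
tf.summary.histogram('drifting_normal', v)

merged = tf.summary.merge_all()
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    writer = tf.summary.FileWriter('logs_drift', sess.graph)
    for i in range(10):
        writer.add_summary(sess.run(merged), i)  # record the current distribution
        sess.run(shift)                          # then drift it for the next step
    writer.close()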

Regarding your image, I assume they smooth the output, which is why you see such results.

Upvotes: 1
