Malo Marrec

Reputation: 609

How do I use TensorBoard with tf.layers?

As the weights are not explicitly defined, how can I pass them to a summary writer?

For example:

conv1 = tf.layers.conv2d(
    tf.reshape(X, [FLAGS.batch, 3, 160, 320]),
    filters=16,
    kernel_size=(8, 8),
    strides=(4, 4),
    padding='same',
    kernel_initializer=tf.contrib.layers.xavier_initializer(),
    bias_initializer=tf.zeros_initializer(),
    kernel_regularizer=None,
    name='conv1',
    activation=tf.nn.elu
)

=>

summarize_tensor(
    ??????
)

Thanks!

Upvotes: 5

Views: 2122

Answers (2)

John

Reputation: 377

While Da Tong's answer is complete, it took me a while to realize how to use it. To save time for other beginners: add the following to your code to include all trainable variables in the TensorBoard summary:

for var in tf.trainable_variables():
    tf.summary.histogram(var.name, var)
merged_summary = tf.summary.merge_all()
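To see where the merged summary actually goes, here is a minimal, self-contained sketch. The toy `dense1` layer, the placeholder `x`, and the temporary log directory are illustrative assumptions, not from the question; on a TensorFlow 2 install the 1.x graph API shown here is reached via `tf.compat.v1`:

```python
import tempfile
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# A toy layer so there is at least one trainable variable to summarize.
x = tf.placeholder(tf.float32, [None, 4], name='x')
dense = tf.layers.dense(x, units=2, name='dense1')

# Attach a histogram summary to every trainable variable, then merge.
for var in tf.trainable_variables():
    tf.summary.histogram(var.name, var)
merged_summary = tf.summary.merge_all()

# Evaluate the merged summary and write it to an events file that
# TensorBoard can read (point tensorboard --logdir at `logdir`).
logdir = tempfile.mkdtemp()
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    writer = tf.summary.FileWriter(logdir, sess.graph)
    summ = sess.run(merged_summary,
                    feed_dict={x: np.zeros((1, 4), np.float32)})
    writer.add_summary(summ, global_step=0)
    writer.close()
```

Note that `merge_all()` returns None if no summaries were registered, so the loop must run before it.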

Upvotes: 9

Da Tong

Reputation: 2026

That depends on what you want to record in TensorBoard. If you want to put every variable into TensorBoard, calling tf.all_variables() or tf.trainable_variables() will give you all the variables. Note that tf.layers.conv2d is just a wrapper that creates a Conv2D instance and calls its apply method. You can unwrap it like this:

conv1_layer = tf.layers.Conv2D(
    filters=16,
    kernel_size=(8, 8),
    strides=(4, 4),
    padding='same',
    kernel_initializer=tf.contrib.layers.xavier_initializer(),
    bias_initializer=tf.zeros_initializer(),
    kernel_regularizer=None,
    name='conv1',
    activation=tf.nn.elu
)

conv1 = conv1_layer.apply(tf.reshape(X, [FLAGS.batch, 3, 160, 320]))

Then you can use conv1_layer.kernel to access the kernel weights.
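As a hedged sketch of wiring the unwrapped layer's kernel into a summary: the placeholder standing in for X is an illustrative assumption (the question's FLAGS object is not available here), the tf.contrib initializers are dropped because tf.contrib does not exist under the tf.compat.v1 API used for modern installs, and note that kernel and bias only exist after apply() has built the layer:

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Stand-in for the question's input; batch size left dynamic.
X = tf.placeholder(tf.float32, [None, 3, 160, 320], name='X')

conv1_layer = tf.layers.Conv2D(
    filters=16,
    kernel_size=(8, 8),
    strides=(4, 4),
    padding='same',
    name='conv1',
    activation=tf.nn.elu
)
conv1 = conv1_layer.apply(X)  # builds the layer, creating kernel and bias

# Now the weight variables are attributes of the layer object.
tf.summary.histogram('conv1/kernel', conv1_layer.kernel)
tf.summary.histogram('conv1/bias', conv1_layer.bias)
merged = tf.summary.merge_all()
```

Accessing conv1_layer.kernel before apply() raises an error, since the variable is only created once the input shape is known.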

Upvotes: 5
