Reputation: 2489
I am considering moving my code base to tf.estimator.Estimator, but I cannot find an example of how to use it in combination with TensorBoard summaries.
MWE:
import numpy as np
import tensorflow as tf

tf.logging.set_verbosity(tf.logging.INFO)

# Declare list of features, we only have one real-valued feature
def model(features, labels, mode):
    # Build a linear model and predict values
    W = tf.get_variable("W", [1], dtype=tf.float64)
    b = tf.get_variable("b", [1], dtype=tf.float64)
    y = W * features['x'] + b
    loss = tf.reduce_sum(tf.square(y - labels))
    # Summaries to display for TRAINING and TESTING
    tf.summary.scalar("loss", loss)
    tf.summary.image("X", tf.reshape(tf.random_normal([10, 10]), [-1, 10, 10, 1]))  # dummy, my inputs are images
    # Training sub-graph
    global_step = tf.train.get_global_step()
    optimizer = tf.train.GradientDescentOptimizer(0.01)
    train = tf.group(optimizer.minimize(loss), tf.assign_add(global_step, 1))
    return tf.estimator.EstimatorSpec(mode=mode, predictions=y, loss=loss, train_op=train)

estimator = tf.estimator.Estimator(model_fn=model, model_dir='/tmp/tf')

# define our data set
x = np.array([1., 2., 3., 4.])
y = np.array([0., -1., -2., -3.])
input_fn = tf.contrib.learn.io.numpy_input_fn({"x": x}, y, 4, num_epochs=1000)

for epoch in range(10):
    # train
    estimator.train(input_fn=input_fn, steps=100)
    # evaluate our model
    estimator.evaluate(input_fn=input_fn, steps=10)
How can I display my two summaries in TensorBoard? Do I have to register a hook in which I use a tf.summary.FileWriter, or something else?
Upvotes: 13
Views: 16086
Reputation: 39
estimator = tf.estimator.Estimator(model_fn=model, model_dir='/tmp/tf')

The argument model_dir='/tmp/tf' means the estimator writes all of its logs to /tmp/tf. Run tensorboard --logdir=/tmp/tf, then open your browser at http://localhost:6006 and you can see the graphs.
Upvotes: 2
Reputation: 715
EDIT:
Upon testing (in v1.1.0, and probably in later versions as well), it is apparent that tf.estimator.Estimator will automatically write summaries for you. I confirmed this using OP's code and TensorBoard. (Some poking around r1.4 leads me to conclude that this automatic summary writing occurs due to tf.train.MonitoredTrainingSession.)
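If all you want is to change how often those automatic summaries are written, the cadence is exposed on the run config rather than on a hook. A minimal sketch, assuming OP's model fn and a TF version (roughly 1.4+) where tf.estimator.RunConfig accepts save_summary_steps:

import tensorflow as tf

# Assumption: tf.estimator.RunConfig(save_summary_steps=...) controls how
# often the Estimator's automatic summaries are written (default is 100).
config = tf.estimator.RunConfig(save_summary_steps=50)
estimator = tf.estimator.Estimator(model_fn=model, model_dir='/tmp/tf',
                                   config=config)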
Ultimately, the automatic summarizing is accomplished with the use of hooks, so if you wanted to customize the Estimator's default summarizing, you could do so using hooks. Below are the (edited) details from the original answer.
You'll want to use hooks, formerly known as monitors. (Linked is a conceptual/quickstart guide; the short of it is that the notion of hooking into / monitoring training is built into the Estimator API. A bit confusingly, though, it doesn't seem like the deprecation of monitors for hooks is really documented except in a deprecation annotation in the actual source code...)
Based on your usage, it looks like r1.2's SummarySaverHook fits the bill.
summary_hook = tf.train.SummarySaverHook(
    SAVE_EVERY_N_STEPS,
    output_dir='/tmp/tf',
    summary_op=tf.summary.merge_all())
You may want to customize the hook's initialization parameters, such as by providing an explicit SummaryWriter or writing every N seconds instead of every N steps.
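For example, a time-based variant might look like this (a sketch only; save_secs is the TF 1.x SummarySaverHook parameter for second-based saving, and 120 is an arbitrary interval):

# Sketch: write the merged summaries every 120 seconds instead of every
# N steps. Exactly one of save_steps / save_secs should be provided.
summary_hook = tf.train.SummarySaverHook(
    save_secs=120,
    output_dir='/tmp/tf',
    summary_op=tf.summary.merge_all())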
If you pass this into the EstimatorSpec, you'll get your customized summary behavior:
return tf.estimator.EstimatorSpec(mode=mode, predictions=y, loss=loss,
                                  train_op=train,
                                  training_hooks=[summary_hook])
EDIT NOTE:
A previous version of this answer suggested passing the summary_hook into estimator.train(input_fn=input_fn, steps=5, hooks=[summary_hook]). This does not work because tf.summary.merge_all() has to be called in the same context as your model graph.
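To make that failure mode concrete, here is a sketch of the pattern the note warns against, reusing OP's estimator and input_fn:

# Does NOT work: merge_all() is evaluated here in the default graph, but
# the summaries are created inside the graph the Estimator builds from
# model_fn, so merge_all() finds nothing and returns None.
summary_hook = tf.train.SummarySaverHook(
    save_steps=5, output_dir='/tmp/tf', summary_op=tf.summary.merge_all())
estimator.train(input_fn=input_fn, steps=5, hooks=[summary_hook])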
Upvotes: 17
Reputation: 1323
For me this worked without adding any hooks or merge_all calls. I just added some tf.summary.image(...) calls in my model_fn, and when I train the model they magically appear in TensorBoard. I'm not sure what the exact mechanism is, however. I'm using TensorFlow 1.4.
Upvotes: 8
Reputation: 417
You can create a SummarySaverHook with tf.summary.merge_all() as the summary_op in the model_fn itself. Pass this hook to the training_hooks param of the EstimatorSpec constructor in your model_fn, as in the sketch below.
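A minimal sketch of that arrangement, reusing the linear model from the question (save_steps=10 is an arbitrary choice):

def model(features, labels, mode):
    W = tf.get_variable("W", [1], dtype=tf.float64)
    b = tf.get_variable("b", [1], dtype=tf.float64)
    y = W * features['x'] + b
    loss = tf.reduce_sum(tf.square(y - labels))
    tf.summary.scalar("loss", loss)
    global_step = tf.train.get_global_step()
    train = tf.group(tf.train.GradientDescentOptimizer(0.01).minimize(loss),
                     tf.assign_add(global_step, 1))
    # merge_all() runs here, inside model_fn, in the same graph that
    # defines the summaries above.
    summary_hook = tf.train.SummarySaverHook(
        save_steps=10,
        output_dir='/tmp/tf',
        summary_op=tf.summary.merge_all())
    return tf.estimator.EstimatorSpec(mode=mode, predictions=y, loss=loss,
                                      train_op=train,
                                      training_hooks=[summary_hook])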
I don't think what @jagthebeetle said is exactly applicable here: the hooks that you pass to the estimator.train method cannot be run for the summaries that you define in your model_fn, since they won't be added to the merge_all op; they remain bound by the scope of model_fn.
Upvotes: 1