anders

Reputation: 75

How to get the gradients of loss with respect to activations in Tensorflow

In the cifar10 example, the gradients of loss with respect to parameters can be computed as follows:

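# `opt` is a tf.train.Optimizer (e.g. tf.train.GradientDescentOptimizer).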
grads_and_vars = opt.compute_gradients(loss)
for grad, var in grads_and_vars:
    # ...

Is there any way to get the gradients of loss with respect to activations (not the parameters), and watch them in Tensorboard?

Upvotes: 4

Views: 2325

Answers (1)

mrry

Reputation: 126194

You can use the tf.gradients() function to compute the gradient of any scalar tensor with respect to any other tensor (assuming the gradients are defined for all of the ops between those two tensors):

activations = ...
loss = f(..., activations)  # `loss` is some function of `activations`.

grad_wrt_activations, = tf.gradients(loss, [activations])

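For example (a minimal sketch; the placeholder `x`, the batch `batch`, and the session setup are assumptions for illustration), you can evaluate the gradient values directly:

# Evaluate the activation gradients for a batch of inputs.
with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())
    grad_vals = sess.run(grad_wrt_activations, feed_dict={x: batch})
    print(grad_vals.shape)  # Same shape as `activations`.
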
Visualizing this in TensorBoard is tricky in general, since grad_wrt_activations is (typically) a tensor with the same shape as activations. Adding a tf.histogram_summary() op is probably the easiest way to visualize it:

# Adds a histogram of `grad_wrt_activations` to the graph, which will be logged
# with the other summaries, and shown in TensorBoard.
tf.histogram_summary("Activation gradient", grad_wrt_activations)
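
To actually see the histogram in TensorBoard, the summary has to be evaluated and written to an event file. A minimal sketch (the training op `train_op`, feed dict `feed`, and log directory are assumptions, and this uses the pre-1.0 summary API to match the calls above):

# Merge all summaries (including the activation-gradient histogram) and
# write them where TensorBoard can read them.
summary_op = tf.merge_all_summaries()
writer = tf.train.SummaryWriter("/tmp/cifar10_logs", sess.graph)

for step in range(1000):
    _, summary = sess.run([train_op, summary_op], feed_dict=feed)
    writer.add_summary(summary, step)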

Upvotes: 5
