Lau

Reputation: 1466

Compute the sum of all gradients in tensorflow

Is there any way to get all the gradient values from a backpropagation pass? I want to compute the sum of all gradients. Is there a fast way to obtain this and plot it in TensorBoard with tf.summary.scalar?

Upvotes: 2

Views: 1848

Answers (1)

Phizaz

Reputation: 622

You can get the explicit gradient tensors by calling optimizer.compute_gradients() (see: https://www.tensorflow.org/api_docs/python/tf/train/Optimizer#compute_gradients), which returns a list of (grad, var) pairs.

Here is an example:

# given some optimizer
optimizer = tf.train.AdamOptimizer()
# and some loss
loss = ... some loss ...

# var_list=None means all trainable variables in the graph
grads_vars = optimizer.compute_gradients(loss=loss, var_list=None)

# OR:
# I prefer to state the variables explicitly;
# if you use tf.keras.Model this is straightforward
grads_vars = optimizer.compute_gradients(loss=loss, var_list=model.variables)

# average the per-variable mean gradients
# (grad is None for variables the loss does not depend on)
mean_grad = tf.zeros(())
for grad, var in grads_vars:
    if grad is not None:
        mean_grad += tf.reduce_mean(grad)
mean_grad /= len(grads_vars)

Then pass mean_grad to tf.summary.scalar.
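One caveat worth noting: averaging the per-tensor means (as the loop above does) is not the same as the mean over every individual gradient value when the tensors have different sizes, and if you literally want the sum of all gradients you can just sum the per-tensor sums. A minimal NumPy sketch (the arrays here are made-up stand-ins for the tensors that compute_gradients would return):

```python
import numpy as np

# Hypothetical gradient arrays of different shapes, standing in
# for the grad tensors returned by compute_gradients().
grads = [np.array([1.0, 3.0]), np.array([[5.0, 7.0], [9.0, 11.0]])]

# Average of per-tensor means (what the loop above computes):
mean_of_means = sum(g.mean() for g in grads) / len(grads)

# Sum of all gradient values, and the true global mean:
total_sum = sum(g.sum() for g in grads)
total_count = sum(g.size for g in grads)
global_mean = total_sum / total_count

print(mean_of_means)  # 5.0  -> (2 + 8) / 2
print(total_sum)      # 36.0
print(global_mean)    # 6.0  -> 36 / 6
```

The two aggregates only coincide when every gradient tensor has the same number of elements, so pick whichever quantity you actually want to plot.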

Or you could use the tf.GradientTape() flavor (which requires a newer version of TensorFlow).

# tfe = tf.contrib.eager
with tfe.GradientTape() as tape:
    y_hat = model(x)
    loss = tf.losses.mean_squared_error(y, y_hat)

grads = tape.gradient(loss, model.variables)
# average the per-variable mean gradients
mean_grad = tf.zeros(())
for grad in grads:
    mean_grad += tf.reduce_mean(grad)
mean_grad /= len(grads)

Note that in newer versions of TensorFlow, tf.GradientTape works in both eager and graph modes.
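Putting both pieces together in modern TF2 style (tf.GradientTape plus tf.summary), a minimal self-contained sketch might look like this. The tiny variable and input here are illustrative stand-ins, not from the original post:

```python
import tensorflow as tf

# Tiny stand-in "model": one weight vector and one input.
w = tf.Variable([1.0, 2.0])
x = tf.constant([3.0, 4.0])

with tf.GradientTape() as tape:
    loss = tf.reduce_sum((w * x) ** 2)

grads = tape.gradient(loss, [w])  # d/dw = 2*w*x^2 -> [18.0, 64.0]

# average the per-variable mean gradients
mean_grad = tf.add_n([tf.reduce_mean(g) for g in grads]) / len(grads)

# log the scalar for TensorBoard
writer = tf.summary.create_file_writer("/tmp/grad_logs")
with writer.as_default():
    tf.summary.scalar("mean_grad", mean_grad, step=0)
```

Point TensorBoard at the log directory (tensorboard --logdir /tmp/grad_logs) to see the curve as you log more steps.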

Upvotes: 2
