Sandipan Banerjee

Reputation: 61

PyTorch equivalent features in TensorFlow?

I was recently reading some PyTorch code and came across the loss.backward() and optimizer.step() functions. Are there equivalents of these in TensorFlow/Keras?
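For context, here is the typical PyTorch training step I am referring to (the model, loss, and data below are just placeholders):

    import torch

    model = torch.nn.Linear(4, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    criterion = torch.nn.MSELoss()

    x, y = torch.randn(8, 4), torch.randn(8, 1)

    optimizer.zero_grad()            # clear gradients from the previous step
    loss = criterion(model(x), y)
    loss.backward()                  # compute gradients via backpropagation
    optimizer.step()                 # update the parameters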

Upvotes: 6

Views: 2221

Answers (1)

user11530462

The loss.backward() equivalent in TensorFlow is tf.GradientTape(). TensorFlow provides the tf.GradientTape API for automatic differentiation, i.e. computing the gradient of a computation with respect to its input variables. TensorFlow "records" all operations executed inside the context of a tf.GradientTape onto a "tape". It then uses that tape and the gradients associated with each recorded operation to compute the gradients of the "recorded" computation using reverse-mode differentiation.
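A minimal sketch of that (TF 2.x; the variable x and the toy loss are just for illustration):

    import tensorflow as tf

    x = tf.Variable(3.0)

    with tf.GradientTape() as tape:
        # Operations on variables inside this context are recorded on the tape.
        loss = x * x

    # Reverse-mode differentiation over the recorded operations,
    # roughly what loss.backward() does in PyTorch.
    grad = tape.gradient(loss, x)
    print(grad)  # 6.0, since d(x*x)/dx = 2*x = 6.0 at x = 3.0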

The optimizer.step() equivalent in TensorFlow is optimizer.minimize(), which minimizes the loss by updating the variable list. Calling minimize() takes care of both computing the gradients and applying them to the variables.
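A minimal sketch (TF 2.x; w, the toy loss, and the learning rate are just illustrative):

    import tensorflow as tf

    w = tf.Variable(2.0)
    opt = tf.keras.optimizers.SGD(learning_rate=0.1)

    # minimize() takes a callable returning the loss plus the variable list;
    # it computes the gradients and applies the update in one call,
    # roughly loss.backward() + optimizer.step() combined.
    opt.minimize(lambda: w * w, var_list=[w])
    print(w.numpy())  # 2.0 - 0.1 * 4.0 = 1.6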

If you want to process the gradients before applying them, you can instead use the optimizer in three steps (see the sketch after this list):

  1. Compute the gradients with tf.GradientTape.
  2. Process the gradients as you wish.
  3. Apply the processed gradients with apply_gradients().
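Here is a sketch of that three-step version, using gradient clipping as an example of "processing" (the variable, loss, and clip norm of 1.0 are arbitrary choices for illustration):

    import tensorflow as tf

    w = tf.Variable(2.0)
    opt = tf.keras.optimizers.SGD(learning_rate=0.1)

    # 1. Compute the gradients with tf.GradientTape.
    with tf.GradientTape() as tape:
        loss = w * w
    grads = tape.gradient(loss, [w])

    # 2. Process the gradients as you wish - here, clip them by norm.
    processed = [tf.clip_by_norm(g, 1.0) for g in grads]

    # 3. Apply the processed gradients with apply_gradients().
    opt.apply_gradients(zip(processed, [w]))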

Hope this answers your question. Happy Learning.

Upvotes: 13
