Darshan Parab

Reputation: 85

TensorFlow v2 replacement for clip_gradients_by_norm

I'm going through the Machine Learning Crash Course by Google Developers. The examples are written for TensorFlow < 2, and I'm trying to run them on v2. The following code from the examples creates a regressor object.

# Create a linear regressor object.
my_optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate)
my_optimizer = tf.contrib.estimator.clip_gradients_by_norm(my_optimizer, 5.0)
linear_regressor = tf.estimator.LinearRegressor(
    feature_columns=feature_columns,
    optimizer=my_optimizer
)

I was able to find that tf.train.GradientDescentOptimizer has moved to tf.optimizers.SGD, but I was unable to find a replacement for tf.contrib.estimator.clip_gradients_by_norm. After a bit of googling I learned that it was replaced by tf.clip_by_norm, but tf.clip_by_norm takes a tensor as input, as opposed to clip_gradients_by_norm, which wraps an optimizer.

I'm new to TensorFlow. Any help in figuring out how to port the code would be appreciated.

Thanks.

Upvotes: 3

Views: 321

Answers (1)

Srihari Humbarwadi

Reputation: 2632

You can get the required functionality by setting the clipnorm argument while initializing the optimizer object.

optimizer = tf.optimizers.SGD(learning_rate=1e-5, clipnorm=1e-4)
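For context, clipnorm applies the same per-tensor rule as tf.clip_by_norm: if a gradient's L2 norm exceeds the threshold, the gradient is rescaled so its norm equals the threshold; otherwise it is left unchanged. A minimal NumPy sketch of that rule (illustrative only, not TensorFlow's actual implementation):

```python
import numpy as np

def clip_by_norm(grad, clip_norm):
    """Rescale grad so its L2 norm is at most clip_norm."""
    norm = np.linalg.norm(grad)
    if norm > clip_norm:
        # Scale the whole tensor down so its norm equals clip_norm,
        # preserving the gradient's direction.
        return grad * (clip_norm / norm)
    return grad

g = np.array([3.0, 4.0])        # L2 norm = 5.0
clipped = clip_by_norm(g, 1.0)  # rescaled to norm 1.0
print(clipped)                  # [0.6 0.8]
```

Applied to the question's snippet, the ported optimizer would then be something like `tf.optimizers.SGD(learning_rate=learning_rate, clipnorm=5.0)`, matching the original clipping threshold of 5.0.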

Upvotes: 2

Related Questions