figs_and_nuts

Reputation: 5763

How to handle None in tf.clip_by_global_norm?

I have read in answers to this question here that tf.clip_by_global_norm() handles None values by simply ignoring them (see @danijar's comment under his own answer), but when I try to apply it I seem to be doing something wrong, as it throws

ValueError: None values not supported.

import tensorflow as tf

tf.reset_default_graph()
z = tf.get_variable(name = 'z', shape = [1])
b = tf.get_variable('b', [1])
c = b*b - 2*b + 1
optimizer = tf.train.AdamOptimizer(0.1)
gradients, variables = zip(*optimizer.compute_gradients(c))
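# note: c depends only on b, so compute_gradients returns None as z's gradient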
gradients = tf.clip_by_global_norm(gradients, 2.5)
train_op = optimizer.apply_gradients(zip(gradients, variables))

Can somebody please tell me what I am doing wrong, or whether tf.clip_by_global_norm() does not handle None gradients and I have to take care of them manually?

The official documentation seems to agree with @danijar's comment. See here:

Any of the entries of t_list that are of type None are ignored.
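A minimal sketch of that documented behavior (TF 1.x graph mode; the gradient values here are made up for illustration) suggests None entries pass straight through:

import tensorflow as tf

# hypothetical gradient list: one real tensor and one None entry
grads = [tf.constant([3.0, 4.0]), None]
clipped, global_norm = tf.clip_by_global_norm(grads, 2.5)
print(clipped)  # [<clipped Tensor>, None] -- the None entry is preserved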

Upvotes: 0

Views: 1405

Answers (1)

nessuno

Reputation: 27052

There's a small problem in your code: you're assigning the return value of tf.clip_by_global_norm to a single variable, when this function returns a pair of values.

The documentation says:

Returns:

list_clipped: A list of Tensors of the same type as list_t.

global_norm: A 0-D (scalar) Tensor representing the global norm.

Hence, the problem arises on the next line: zip(gradients, variables) pairs the clipped-gradient list with z and the global-norm scalar with b, and apply_gradients fails when it tries to interpret those as gradient tensors.

You can easily fix your code by ignoring the returned global_norm value:

gradients, _ = tf.clip_by_global_norm(gradients, 2.5)
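For completeness, here is a minimal sketch of the question's snippet with that one-line fix applied (TF 1.x API); z's gradient stays None, which both tf.clip_by_global_norm and apply_gradients tolerate:

import tensorflow as tf

tf.reset_default_graph()
z = tf.get_variable('z', shape=[1])
b = tf.get_variable('b', shape=[1])
c = b * b - 2 * b + 1

optimizer = tf.train.AdamOptimizer(0.1)
gradients, variables = zip(*optimizer.compute_gradients(c))
# unpack the pair: keep the clipped gradients, discard the global norm
gradients, _ = tf.clip_by_global_norm(gradients, 2.5)
# z's gradient is None; apply_gradients skips (None, variable) pairs
train_op = optimizer.apply_gradients(zip(gradients, variables))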

Upvotes: 1
