Maruf

Reputation: 790

Error while clipping gradient

I was following this TensorFlow tutorial on gradient clipping while working with a multilayer perceptron.

grads_and_vars = optimizer.compute_gradients(cross_entropy_loss, trainable_variable)
capped_grads_and_vars = [(tf.clip_by_global_norm(gv[0],5), gv[1]) for gv in grads_and_vars]
optimizer.apply_gradients(capped_grads_and_vars)

TensorFlow raises the following error:

in clip_by_global_norm raise TypeError("t_list should be a sequence")

trainable_variable is a list I created while building the model. Assume I have a trainable variable (tf.Variable); I add it to the trainable_variable list with the following command.

trainable_variable.append(var)  # where var is a trainable tf.Variable

Upvotes: 3

Views: 280

Answers (1)

Maruf

Reputation: 790

The key point with this type of problem is that the trainable_variable list may contain tensors that are never initialized or never used in the graph. For such variables, optimizer.compute_gradients returns None as the gradient, and tf.clip_by_global_norm then raises this TypeError because it expects a sequence of tensors, not a single tensor or None. Make sure every tensor you append to the trainable_variable list is actually part of the graph. The gradients can also come out as NaN, and this kind of error can likewise show up when a variable holds an unnatural value.
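To illustrate, here is a minimal NumPy sketch of the fixed pattern (the function name clip_gradients and the example values are mine, not from the tutorial): filter out None gradients for unused variables, then clip the whole gradient list at once, mirroring what tf.clip_by_global_norm computes over a sequence.

```python
import numpy as np

def clip_gradients(grads_and_vars, clip_norm):
    """Mimic the corrected TF pattern in plain NumPy.

    grads_and_vars: list of (gradient, variable) pairs, where a
    gradient may be None if the variable is unused in the graph.
    """
    # Drop variables that received no gradient; a None entry is what
    # makes clipping blow up in the original code.
    pairs = [(g, v) for g, v in grads_and_vars if g is not None]
    grads = [g for g, _ in pairs]
    # Global-norm clipping over the *sequence* of gradients:
    # global_norm = sqrt(sum of squared element norms), and every
    # gradient is scaled by clip_norm / max(global_norm, clip_norm).
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    scale = clip_norm / max(global_norm, clip_norm)
    return [(g * scale, v) for g, v in pairs], global_norm
```

The same structure applies in TensorFlow: pass the list of gradients (with None entries removed) to tf.clip_by_global_norm, then zip the clipped gradients back with their variables for apply_gradients.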

Upvotes: 1
