Reputation: 61
I'm just beginning to learn TensorFlow and I have some problems with it. In the training loop I want to ignore the small weights and stop training them. I've assigned these small weights to zero. I searched the tf API and found that tf.Variable(weight, trainable=False)
can stop a weight from being trained. I would like to use this whenever the value of a weight equals zero. I tried to use .eval()
, but it raised ValueError: Cannot evaluate tensor using eval(): No default session. I have no idea how to get the value of a variable inside the training loop. Another option would be to modify tf.train.GradientDescentOptimizer()
, but I don't know how to do that. Has anyone implemented something like this, or can you suggest another method? Thanks in advance!
Upvotes: 0
Views: 832
Reputation: 53758
I don't know of a use case for stopping the training of some variables; it's probably not what you should do.
Anyway, calling tf.Variable()
(if I understood you correctly) is not going to help you, because it's called just once, when the graph is defined. The first argument is initial_value
: as the name suggests, it's assigned only during initialization.
Instead, you can use tf.assign like this:
with tf.Session() as session:
    assign_op = var.assign(0)
    session.run(assign_op)
It will update the variable during the session, which is what you're asking for.
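If you also want those zeroed weights to *stay* frozen on later steps, one common trick is to keep a binary mask and multiply the gradient update by it. The snippet below is a framework-agnostic NumPy sketch of that idea (the quadratic loss, learning rate, and threshold are all made up for illustration; in TensorFlow you would do the same masking between the optimizer's compute_gradients and apply_gradients calls):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: minimize 0.5 * ||w - target||^2 by gradient descent,
# zeroing and freezing any weight whose magnitude drops below a threshold.
target = np.array([1.0, -2.0, 0.0005, 3.0])
w = rng.normal(size=4)
mask = np.ones(4)          # 1 = trainable, 0 = frozen
lr, threshold = 0.1, 0.01

for _ in range(200):
    grad = w - target                       # gradient of the quadratic loss
    w -= lr * grad * mask                   # frozen weights receive no update
    small = (np.abs(w) < threshold) & (mask == 1)
    w[small] = 0.0                          # zero out the small weights ...
    mask[small] = 0.0                       # ... and stop training them
```

Here the third weight converges toward its tiny target, falls under the threshold, and is pinned at exactly zero for the rest of training, while the other weights keep updating normally.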
Upvotes: 0
Reputation: 61
Are you looking to apply regularization to the weights?
There is an apply_regularization
method in the API that you can use to accomplish that.
See: How to exactly add L1 regularisation to tensorflow error function
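As a side note on why L1 regularization relates to zeroing small weights: a proximal-gradient update (not what plain SGD on the penalized loss does, which only adds the penalty's subgradient) sets any weight whose magnitude is at or below the regularization strength to exactly zero via soft-thresholding. A minimal NumPy sketch of that step, with made-up weights and strength:

```python
import numpy as np

def soft_threshold(w, lam):
    """Proximal operator of lam * ||w||_1: shrink toward zero by lam,
    clipping anything that would cross zero to exactly zero."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

w = np.array([0.5, -0.003, 0.02, -1.2])
lam = 0.01
w_new = soft_threshold(w, lam)   # weights with |w| <= lam become exactly 0
```

This is why L1-style methods are a principled alternative to hand-rolled "ignore the small weights" logic: sparsity falls out of the update itself.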
Upvotes: 0