Reputation: 3
Assuming I have a weight matrix that looks like [[a, b], [c, d]], is it possible in TensorFlow to fix the values of b and c to zero so that they don't change during optimization?
Upvotes: 0
Views: 645
Reputation: 1104
Some sample code:
import tensorflow as tf

A = tf.Variable([[1., 0.], [3., 0.]])
A1 = A[:, 0:1]  # just some slicing of your variable
A2 = A[:, 1:2]
A2_stop = tf.stop_gradient(tf.identity(A2))  # no gradient flows into this column
A = tf.concat((A1, A2_stop), axis=1)  # use this tensor in place of the raw variable
Actually, tf.identity is needed to stop the gradient before A2.
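As a quick sanity check, here is a minimal sketch (assuming TensorFlow 2.x eager execution and tf.GradientTape, which the snippet above does not use) showing that the stopped column gets a zero gradient:

import tensorflow as tf

W = tf.Variable([[1., 0.], [3., 0.]])
with tf.GradientTape() as tape:
    col0 = W[:, 0:1]                    # this column keeps its gradient
    col1 = tf.stop_gradient(W[:, 1:2])  # this column is excluded from the gradient
    A = tf.concat((col0, col1), axis=1)
    loss = tf.reduce_sum(A ** 2)
print(tape.gradient(loss, W))  # [[2., 0.], [6., 0.]] -- zeros in the stopped column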
Upvotes: 2
Reputation: 1635
There are three ways to do this, you can
Upvotes: 1
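For example, a constant binary mask can be multiplied into the forward pass so the fixed entries stay at zero and never receive updates. A minimal sketch of this masking approach, assuming TensorFlow 2.x and a Keras optimizer (W, mask, and the toy loss are placeholders):

import tensorflow as tf

W = tf.Variable([[1., 0.], [3., 0.]])
mask = tf.constant([[1., 0.], [0., 1.]])  # 0 at the entries fixed to zero (b and c)
opt = tf.keras.optimizers.SGD(learning_rate=0.1)

with tf.GradientTape() as tape:
    W_masked = W * mask                   # b and c are forced to 0 here
    loss = tf.reduce_sum(W_masked ** 2)   # toy loss just for illustration

grads = tape.gradient(loss, [W])          # gradient is 0 at the masked entries
opt.apply_gradients(zip(grads, [W]))      # so those entries never move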