For example, I need to compute the gradient of the cross_entropy with respect to x, but I need to evaluate that gradient function at a different value. That is:

f'(x)|_{x = x_t}

I think tf.gradients() will only give me the gradient evaluated at the current value of x. So does TensorFlow provide any feature like this?
Upvotes: 1
Views: 975
The result of tf.gradients is a tensor (a list of tensors in general), not a float value. In a way, this tensor is a function: it can be evaluated at any point. The client only needs to feed the desired input values.
Example:
import numpy as np
import tensorflow as tf

features = 3
n_samples = 10
hidden = 1

# Placeholders: the gradient tensor can later be evaluated at whatever values are fed here.
X = tf.placeholder(dtype=tf.float32, shape=[n_samples, features])
Y = tf.placeholder(dtype=tf.float32, shape=[n_samples])

# A simple linear model with a squared-error cost.
W = tf.Variable(np.ones([features, hidden]), dtype=tf.float32, name="weight")
b = tf.Variable(np.ones([hidden]), dtype=tf.float32, name="bias")
pred = tf.add(tf.matmul(X, W), b)
cost = tf.reduce_mean(tf.pow(pred - Y, 2))

# `dc_dw` and `dc_db` are tensors, not numbers: they represent the gradient symbolically.
dc_dw, dc_db = tf.gradients(cost, [W, b])

with tf.Session() as session:
    session.run(tf.global_variables_initializer())
    # Let's compute `dc_dw` at the all-ones input matrix.
    print(dc_dw.eval(feed_dict={X: np.ones([n_samples, features]),
                                Y: np.ones([n_samples])}))
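To tie this back to the question: evaluating f'(x)|_{x = x_t} is just a matter of feeding x_t through feed_dict and running the same gradient tensor again. Here is a minimal sketch, assumed to run inside the with tf.Session() as session: block above; the names x_t, y_t, and grad_at_x_t are only illustrative.

# Hypothetical point x_t at which to evaluate the gradient, i.e. f'(x)|_{x = x_t}.
# (Random data here; any arrays with the right shapes work.)
x_t = np.random.rand(n_samples, features).astype(np.float32)
y_t = np.random.rand(n_samples).astype(np.float32)

# Re-evaluate the same `dc_dw` tensor at the new point simply by feeding it.
grad_at_x_t = session.run(dc_dw, feed_dict={X: x_t, Y: y_t})
print(grad_at_x_t)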
Upvotes: 1