Reputation: 898
My cost function involves the matrix
T=[[1.0-a,b],[a,1.0-b]]
I can define
import numpy as np
import tensorflow as tf
a=0.3
b=0.4
T = tf.Variable([[1.0-a, b], [a, 1.0-b]])
and this works well in the optimization, but then I am effectively declaring four variables: 1-a, b, a, 1-b (the gradient has four elements). What I want instead is just two variables, a and b (so the gradient has two elements).
I thought of doing something like
var = tf.Variable([a,b])
T = tf.constant([[1.0-var[0],var[1]],[var[0],1.0-var[1]]])
but this does not work, outputting the following:
TypeError: List of Tensors when single Tensor expected
So how can I construct a tensor made of tf.Variable objects?
Thank you.
Upvotes: 2
Views: 256
Reputation: 59731
I think what you need is:
import tensorflow as tf
a = tf.Variable(0.3)
b = tf.Variable(0.4)
T = tf.convert_to_tensor([[1.0 - a, b], [a, 1.0 - b]])
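To confirm that the gradient now has only two elements, here is a quick check continuing from the snippet above (the cost is a dummy standing in for your real one):

cost = tf.reduce_sum(tf.square(T))   # dummy cost, just for illustration
grads = tf.gradients(cost, [a, b])   # gradients w.r.t. a and b only

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(grads))           # two scalars, one per variable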
Upvotes: 2
Reputation: 403128
Initialise T as a placeholder.
T = tf.placeholder(tf.float32, [2, 2])
When starting your session, compute T and pass it in through a feed_dict:
with tf.Session() as sess:
    a, b = .3, .4
    inp = np.array([[1 - a, b], [a, 1 - b]])
    sess.run(optimizer, feed_dict={T: inp})
where optimizer is the node that minimizes your cost function.
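For completeness, a minimal self-contained sketch of this pattern; the cost function and the extra trainable variable w are made up purely for illustration:

import numpy as np
import tensorflow as tf

T = tf.placeholder(tf.float32, [2, 2])
w = tf.Variable(tf.ones([2, 1]))                  # hypothetical trainable variable
cost = tf.reduce_sum(tf.square(tf.matmul(T, w)))  # hypothetical cost built from T
optimizer = tf.train.GradientDescentOptimizer(0.01).minimize(cost)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    a, b = .3, .4
    inp = np.array([[1 - a, b], [a, 1 - b]])
    sess.run(optimizer, feed_dict={T: inp})

Note that only w is updated by this step; whatever you feed through feed_dict is treated as a fixed input for that run.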
Upvotes: 1