Reputation: 357
What I'm trying to define is the following idea:
Consider we have these tensors:
a = tf.constant([1., 1.5, 1.2]) # tensor with shape [3,]
b = tf.constant([1., 2., 3.]) # ""
c = tf.constant([3., 0., 6.]) # ""
t = tf.constant([0.5, 0.6, 0.7, 2., 4., 5., 6.]) # tensor with shape [7,]
Now let's say I want to compute a new tensor, working element-wise with the previous tensors, for example:
def new_tensor(a, b, c, t):
    X = tf.constant([[tf.sin(a*t[1]), b*t[3], c+t[4]],
                     [tf.cos(b*t[5]), tf.atan2(t[5], c), a+t[2]+b],
                     [a+t[4], a+b, c*t[0]]])
    return X
X should be a tensor with shape [3, 3, 3]. That is, I want to define a function that takes four tensors as input: three of them with the same shape and the fourth with a different one. I want the function to compute a tensor (X) for each value of the first three inputs (a, b, c).
With this code TensorFlow gives this error:
TypeError: List of Tensors when single Tensor expected
According to this post, this is because tf.constant cannot take a tensor as input, and they recommend using tf.Variable instead. But I don't think that fits my case, because I have to work with X later and don't want to initialize it, etc. I have also read this other post, but couldn't find an answer to my problem.
Is there any way to do what I want? Does my code make sense for my purpose? Thank you in advance.
UPDATE: with jdehesa's answer
Taking @jdehesa's answer and making the resulting tensor simpler:
def new_tensor(a, b, c, t):
    # Could also use tf.convert_to_tensor
    X = tf.stack([[a+t[1], b*t[1], c+t[1]],
                  [b*t[0], t[5]+c, a+t[2]+b],
                  [a+t[4], a+b, c*t[0]]])
    return X
And with these tensors:
a = tf.constant([1., 1., 1.]) # tensor with shape [3,]
b = tf.constant([2., 2., 2.]) # ""
c = tf.constant([3., 3., 3.]) # ""
t = tf.constant([1., 1., 1., 1., 1., 1., 1.]) # tensor with shape [7,]
What I get is the following tensor:
# When evaluating x = new_tensor(a,b,c,t)
[[[2. 2. 2.]
[2. 2. 2.]
[4. 4. 4.]]
[[2. 2. 2.]
[4. 4. 4.]
[4. 4. 4.]]
[[2. 2. 2.]
[3. 3. 3.]
[3. 3. 3.]]]
But what I would expect is the following:
[[[2. 2. 4.]
[2. 4. 4.]
[2. 3. 3.]]
[[2. 2. 4.]
[2. 4. 4.]
[2. 3. 3.]]
[[2. 2. 4.]
[2. 4. 4.]
[2. 3. 3.]]]
That is, I want it to be evaluated for each element of the input tensors.
Upvotes: 3
Views: 471
Reputation: 59681
That's correct, you can only pass Python or NumPy values to tf.constant, but you can build your tensor with tf.stack or, if you prefer, more generally with tf.convert_to_tensor:
import tensorflow as tf

def new_tensor(a, b, c, t):
    # Could also use tf.convert_to_tensor
    X = tf.stack([[tf.sin(a*t[1]), b*t[3], c+t[4]],
                  [tf.cos(b*t[5]), tf.atan2(t[5], c), a+t[2]+b],
                  [a+t[4], a+b, c*t[0]]])
    return X

with tf.Graph().as_default(), tf.Session() as sess:
    a = tf.constant([1., 1.5, 1.2])  # tensor with shape [3,]
    b = tf.constant([1., 2., 3.])    # ""
    c = tf.constant([3., 0., 6.])    # ""
    t = tf.constant([0.5, 0.6, 0.7, 2., 4., 5., 6.])  # tensor with shape [7,]
    x = new_tensor(a, b, c, t)
    print(sess.run(x))
# [[[ 0.5646425 0.7833269 0.65938467]
# [ 2. 4. 6. ]
# [ 7. 4. 10. ]]
#
# [[ 0.2836622 -0.8390715 -0.7596879 ]
# [ 1.0303768 1.5707964 0.69473827]
# [ 2.7 4.2 4.9 ]]
#
# [[ 5. 5.5 5.2 ]
# [ 2. 3.5 4.2 ]
# [ 1.5 0. 3. ]]]
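As the comment in the code says, the same nested list of tensors can also be passed to tf.convert_to_tensor, which auto-packs a (nested) list of tensors into a single stacked tensor. A minimal sketch of that variant, assuming the same TF 1.x setup (the helper name new_tensor_ctt is just for illustration):
import tensorflow as tf

def new_tensor_ctt(a, b, c, t):
    # tf.convert_to_tensor auto-packs the nested list of tensors
    # into one [3, 3, 3] tensor, just like tf.stack above.
    return tf.convert_to_tensor([[tf.sin(a*t[1]), b*t[3], c+t[4]],
                                 [tf.cos(b*t[5]), tf.atan2(t[5], c), a+t[2]+b],
                                 [a+t[4], a+b, c*t[0]]])

with tf.Graph().as_default(), tf.Session() as sess:
    a = tf.constant([1., 1.5, 1.2])
    b = tf.constant([1., 2., 3.])
    c = tf.constant([3., 0., 6.])
    t = tf.constant([0.5, 0.6, 0.7, 2., 4., 5., 6.])
    print(sess.run(new_tensor_ctt(a, b, c, t)))  # same values as the output above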
EDIT: For your second example, to get the result that you want, you need to use tf.transpose to change the order of the dimensions of the tensor:
import tensorflow as tf

def new_tensor(a, b, c, t):
    # Could also use tf.convert_to_tensor
    X = tf.stack([[a+t[1], b*t[1], c+t[1]],
                  [b*t[0], t[5]+c, a+t[2]+b],
                  [a+t[4], a+b, c*t[0]]])
    X = tf.transpose(X, (2, 0, 1))
    return X

with tf.Graph().as_default(), tf.Session() as sess:
    a = tf.constant([1., 1., 1.])  # tensor with shape [3,]
    b = tf.constant([2., 2., 2.])  # ""
    c = tf.constant([3., 3., 3.])  # ""
    t = tf.constant([1., 1., 1., 1., 1., 1., 1.])  # tensor with shape [7,]
    x = new_tensor(a, b, c, t)
    print(sess.run(x))
# [[[2. 2. 4.]
# [2. 4. 4.]
# [2. 3. 3.]]
#
# [[2. 2. 4.]
# [2. 4. 4.]
# [2. 3. 3.]]
#
# [[2. 2. 4.]
# [2. 4. 4.]
# [2. 3. 3.]]]
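A note on the perm argument that may help: perm=(2, 0, 1) means output axis 0 comes from input axis 2, so the last axis of the stacked tensor (the one indexing the elements of a, b, c) moves to the front, and the result holds one [3, 3] matrix per element. A small self-contained illustration, assuming the same TF 1.x setup:
import tensorflow as tf

with tf.Graph().as_default(), tf.Session() as sess:
    # x[i, j, k]: row i, column j of the matrix built for element k
    x = tf.reshape(tf.range(27), (3, 3, 3))
    y = tf.transpose(x, (2, 0, 1))  # y[k, i, j] == x[i, j, k]
    print(sess.run(tf.reduce_all(tf.equal(y[0], x[:, :, 0]))))  # True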
Upvotes: 4