metatheorem

Reputation: 598

Passing Dataset.from_tensor_slices a list vs a tuple

I noticed this subtlety while working through some tensorflow code (v1.13.1):

import tensorflow as tf  # v1.13.1
Dataset = tf.data.Dataset

tf.enable_eager_execution()

for n in Dataset.from_tensor_slices(([1, 2], [3, 4])).make_one_shot_iterator():
    print(n)
#
# prints:
# (<tf.Tensor: id=8, shape=(), dtype=int32, numpy=1>, <tf.Tensor: id=9, shape=(), dtype=int32, numpy=3>)
# (<tf.Tensor: id=12, shape=(), dtype=int32, numpy=2>, <tf.Tensor: id=13, shape=(), dtype=int32, numpy=4>)
#

for n in Dataset.from_tensor_slices([[1, 2], [3, 4]]).make_one_shot_iterator():
    print(n)
#
# prints:
# tf.Tensor([1 2], shape=(2,), dtype=int32)
# tf.Tensor([3 4], shape=(2,), dtype=int32)
#

The only difference is that the first loop passes the two lists as a tuple, while the second passes them as a nested list. I expected the second loop to behave like the first, slicing into two tensors per element. Is this a deliberate difference in how TensorFlow treats incoming tuples and lists?
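In plain Python terms (no TensorFlow needed), the two slicings behave roughly like the following sketch — the tuple is sliced component-wise in lockstep, while the list is treated as one 2x2 value sliced by row:

```python
components = ([1, 2], [3, 4])  # tuple: two components, sliced in lockstep
print(list(zip(*components)))  # → [(1, 3), (2, 4)]  — matches the first loop

matrix = [[1, 2], [3, 4]]      # list: one value of shape (2, 2), sliced by row
print(list(matrix))            # → [[1, 2], [3, 4]]  — matches the second loop
```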

Upvotes: 3

Views: 1979

Answers (1)

metatheorem

Reputation: 598

Thanks @giser_yugang for the link / answer.

From the linked issue:

This is working as intended: the tf.data API uses Python lists to signify values that should be converted implicitly to tensors, and Python tuples to signify values that should be interpreted as multiple components of a (potentially nested) structure.

Probably the cause of a lot of subtle bugs...

Upvotes: 5
