sten

Reputation: 7486

Iterating over torch tensor

What is the best and fastest way to iterate over a Tensor? It is confusing why I get tensors instead of values.

I got this:

  [ x for x in t]
  Out[122]: [tensor(-0.12), tensor(-0.11), tensor(0.68), tensor(0.68), tensor(0.17)]

but expected this behavior:

  [ x for x in t.numpy() ]
  Out[123]: [-0.11932722, -0.114598714, 0.67563725, 0.6756373, 0.16548502]

I would prefer not to convert to numpy, if possible.

Upvotes: 3

Views: 3309

Answers (1)

Shai

Reputation: 114976

With numpy everything is simpler, because np.arrays are just collections of numbers that are always stored on the CPU. Therefore, if you iterate over an np.array you get these float numbers.
However, PyTorch tensors store not only numbers but also their gradients, and they may reside on the CPU or the GPU. Thus, in order to preserve all this "side information", PyTorch returns single-element tensors when iterating over a tensor.
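As a minimal sketch (the tensor here is made up for illustration), each element produced by iteration is itself a 0-dim tensor that keeps the parent's dtype, device and requires_grad flag:

 import torch

 # hypothetical example tensor; values are random, only for illustration
 t = torch.randn(5, requires_grad=True)

 for x in t:
     # each x is a 0-dim torch.Tensor carrying the parent's metadata
     print(type(x), x.dtype, x.device, x.requires_grad)
     # <class 'torch.Tensor'> torch.float32 cpu True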

If you insist on getting simple "numbers" from a tensor, you can use tensor.item():

 [x.item() for x in t]

Or, tensor.tolist():

t.tolist()
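Putting both options together, here is a small self-contained sketch (the tensor values are arbitrary, chosen only to resemble the question's output):

 import torch

 t = torch.tensor([-0.12, -0.11, 0.68, 0.68, 0.17])  # arbitrary example values

 # per element: .item() converts each 0-dim tensor to a plain Python float
 as_floats = [x.item() for x in t]

 # whole tensor at once: .tolist() returns a (nested) list of Python numbers
 as_list = t.tolist()

 print(as_floats)  # plain floats, not tensors
 print(as_list)

Note that t.tolist() avoids the Python-level loop entirely, so it is usually the more convenient of the two when you want the whole tensor as plain numbers.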

For more information on the differences between numpy np.arrays and torch.tensors, see this answer.

Upvotes: 3
