KansaiRobot

Reputation: 9912

What is the "data.max" of a torch.Tensor?

I have been browsing the documentation of torch.Tensor, but I have not been able to find this (just similar things).

If a_tensor is a torch.Tensor, what is a_tensor.data.max? What type, etc.?

In particular, I am reading a_tensor.data.max(1)[1] and a_tensor.data.max(1)[1][i].cpu().numpy().

Upvotes: 1

Views: 1340

Answers (2)

Ivan

Reputation: 40628

When accessing .data, you are accessing the tensor's underlying data. The returned object is a torch.*Tensor as well; however, it is not attached to any computational graph.

Take this example:

>>> import torch
>>> x = torch.rand(4, requires_grad=True)
>>> y = x**2

>>> y
tensor([0.5272, 0.3162, 0.1374, 0.3004], grad_fn=<PowBackward0>)

While y.data is detached from the graph (it has no grad_fn), it is not a copy of y: it shares the same underlying storage.

>>> y.data
tensor([0.5272, 0.3162, 0.1374, 0.3004])

Therefore, if you modify y.data's components, you end up modifying y itself:

>>> y.data[0] = 1

>>> y
tensor([1.0000, 0.3162, 0.1374, 0.3004], grad_fn=<PowBackward0>)

Notice how the grad_fn didn't change there. If you had done y[0] = 1 instead, grad_fn would have been updated to <CopySlices>. This shows that modifying a tensor through .data is not accounted for by autograd, i.e., you won't be able to backpropagate through these operations. You need to work with y, not y.data, whenever you plan to use Autograd.
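To make that concrete, here is a small self-contained sketch of the same setup (the exact values don't matter); autograd never sees the change made through .data:

>>> import torch
>>> x = torch.rand(4, requires_grad=True)
>>> y = x**2
>>> y.data[0] = 1                  # in-place change through .data: not recorded by autograd
>>> y.sum().backward()
>>> torch.allclose(x.grad, 2*x)    # gradient is still 2*x everywhere, including index 0
True

Had the update been the tracked y[0] = 1 instead, the backward pass would have produced x.grad[0] == 0, since autograd would know that y[0] no longer depends on x[0].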


So, to answer your question: a_tensor.data is a torch.*Tensor of the same type as a_tensor, and a_tensor.data.max is the max method bound to that tensor.
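As for the specific expressions in the question, here is an illustrative sketch; the tensor shape and the index i are made up for the example:

>>> import torch
>>> a_tensor = torch.rand(3, 5)      # e.g. 3 rows of 5 scores
>>> i = 0                            # an example row index
>>> # .max(1) reduces over dim 1 and returns a (values, indices) pair,
>>> # so [1] selects the indices, i.e. the argmax of each row.
>>> pred = a_tensor.data.max(1)[1]   # shape (3,), dtype torch.int64
>>> # [i] picks one of those indices, .cpu() moves it to host memory
>>> # (a no-op if it already lives on the CPU), and .numpy() converts
>>> # it to a NumPy value.
>>> pred[i].cpu().numpy()

This pattern is commonly used to read argmax indices (e.g. predicted class labels) out of a tensor and into NumPy.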

Upvotes: 3

Bedir Yilmaz

Reputation: 4083

a_tensor.data is of type torch.Tensor as well.

You can find the details of Tensor.max() in the PyTorch documentation.
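For a quick illustration of the two call forms (the values are made up):

>>> import torch
>>> t = torch.tensor([[1., 9., 3.],
...                   [7., 2., 5.]])
>>> t.max()                          # overall maximum
tensor(9.)
>>> values, indices = t.max(dim=1)   # per-row maxima and their column indices
>>> values
tensor([9., 7.])
>>> indices
tensor([1, 0])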

Upvotes: 0
