Reputation: 2088
The torch command
x = torch.Tensor(4, 3)
is supposed to create an uninitialized tensor (according to the documentation). But when we print the content of x, there are values there.
>>> from __future__ import print_function
>>> print(x)
0.0000e+00 -8.5899e+09 6.1021e-38
8.5920e+09 1.7470e-21 4.5806e-41
0.0000e+00 0.0000e+00 0.0000e+00
0.0000e+00 0.0000e+00 0.0000e+00
[torch.FloatTensor of size 4x3]
So what is the meaning of uninitialized here?
Upvotes: 2
Views: 497
Reputation: 24129
It means that PyTorch just reserves a certain region of memory for the tensor, without changing its content.
This part of the memory was previously occupied by something else (another tensor, or maybe something completely different such as a browser or code editor, if you are using CPU memory). The values inside are not cleared afterwards, for performance reasons.
The content (which might previously have been something entirely different) is simply interpreted as the values of the tensor.
Writing zeros or performing some other initialization takes additional computation, so just reserving the area in memory is much faster.
But the values are also completely uncontrolled and can be arbitrarily large, so in many cases you will want to do an explicit initialization afterwards.
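As a minimal sketch (using the current torch.empty / torch.zeros / torch.rand factory functions, which cover the same behaviour as torch.Tensor(4, 3)), you can contrast an uninitialized tensor with explicitly initialized ones:

import torch

# Uninitialized: memory is only reserved, so the printed values are
# whatever bytes happened to be in that region of memory.
x = torch.empty(4, 3)

# Explicitly initialized alternatives, if you need defined values:
zeros = torch.zeros(4, 3)   # all elements set to 0
rand = torch.rand(4, 3)     # uniform samples in [0, 1)

print(x)
print(zeros)
print(rand)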
Upvotes: 2