Reputation: 61455
I have a sample tensor like this:
In [137]: x = x.new_ones((5, 3), dtype=torch.double)
In [138]: x
Out[138]:
tensor([[1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.]], dtype=torch.float64)
Now, I want to free the memory of this tensor by overwriting its contents using torch.empty(), which takes an out argument.
In [139]: torch.empty((5, 3), out=x)
Out[139]:
tensor([[1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.],
        [1., 1., 1.]], dtype=torch.float64)
However, the values in the original tensor x still remain the same. If this is the case, then what is the purpose of the keyword argument out in torch.empty? What am I missing here?
Upvotes: 1
Views: 198
Reputation: 29720
Here's the C++ implementation of empty with an out param from the source code:
Tensor& empty_out(Tensor& result, IntList size) {
  if (result.is_sparse()) {
    result.sparse_resize_and_clear_(size, size.size(), 0);
  } else {
    result.resize_(size);
  }
  return result;
}
So for dense tensors, all it does is resize the tensor appropriately; in your case the size is the same, so nothing changes.
In [21]: x = torch.ones((5, 3), dtype=torch.double)
In [22]: torch.empty((2, 3), out=x)
Out[22]:
tensor([[1., 1., 1.],
        [1., 1., 1.]], dtype=torch.float64)
In [23]: torch.empty((2, 8), out=x)
Out[23]:
tensor([[ 1.0000e+00,  1.0000e+00,  1.0000e+00,  1.0000e+00,  1.0000e+00,
          1.0000e+00,  1.0000e+00,  1.0000e+00],
        [ 1.0000e+00,  1.0000e+00,  1.0000e+00,  1.0000e+00,  1.0000e+00,
          1.0000e+00,  1.0000e+00,  4.6631e-310]], dtype=torch.float64)
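You can verify the buffer reuse in the shrinking case directly with Tensor.data_ptr(), which returns the address of the underlying storage (a minimal sketch, assuming a CPU tensor; note that recent PyTorch versions emit a warning when a non-empty out tensor is resized):

```python
import torch

x = torch.ones((5, 3), dtype=torch.double)
ptr = x.data_ptr()  # address of x's underlying buffer

# Shrinking to (2, 3): 6 elements fit within the existing
# 15-element storage, so the same buffer is reused in place.
y = torch.empty((2, 3), out=x)
print(y.shape)                # torch.Size([2, 3])
print(y.data_ptr() == ptr)    # True: no new allocation
```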
First of all, empty doesn't free memory; it only ensures that a tensor of the appropriate size is allocated. In your case such a tensor has already been allocated, so empty has nothing to do: it is not going to allocate a new empty tensor somewhere else in memory. In the second empty example above, we are forced to allocate a larger tensor (2 * 8 = 16 elements compared to 5 * 3 = 15), and we can see that the last element is garbage, since it lies beyond the contiguous memory block that had previously been initialized. empty won't force-clear your whole tensor to 0 or anything like that because, again, its contents are uninitialized data.
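As for the purpose of out=: it lets you reuse a pre-allocated buffer across repeated calls instead of allocating a fresh result tensor every time. A minimal sketch, using torch.add as a stand-in for any operation that accepts out= (and Tensor.zero_ if you actually want to clear the values):

```python
import torch

a = torch.randn(100, 100)
b = torch.randn(100, 100)
buf = torch.empty(100, 100)   # allocated once, up front
ptr = buf.data_ptr()

# Each call writes its result into buf rather than
# allocating a new output tensor.
for _ in range(3):
    torch.add(a, b, out=buf)

print(buf.data_ptr() == ptr)            # True: same buffer throughout
print(torch.allclose(buf, a + b))       # True

# If the goal is to overwrite a tensor's contents, do it explicitly:
buf.zero_()                              # in-place fill with zeros
print(buf.sum().item())                  # 0.0
```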
Upvotes: 1