th> x = torch.rand(2)
[0.4943s]
th> y = torch.rand(2)
[0.0001s]
th> x
0.6115
0.4986
[torch.DoubleTensor of size 2]
[0.0002s]
th> z = torch.cat(x,y)
[0.0003s]
th> z
0.6115
0.4986
0.5171
0.1785
[torch.DoubleTensor of size 4]
[0.0001s]
th> z[1] = 3
[0.0001s]
th> z[1]
3
[0.0001s]
th> x[1]
0.61146148154512
Modifying z does not modify x. Is there any way of concatenating x and y such that modifying z does modify x?
Upvotes: 3
Views: 300
You can achieve this kind of behaviour, but the other way around: torch.cat always copies its inputs into fresh storage, so z can never share memory with x. Instead, start with a bigger tensor, your main "storage", and then create sub-tensors of it, which share its internal state.
See in particular the :sub method (the following code sample is taken from the Torch documentation):
x = torch.Tensor(5, 6):zero()
> x
0 0 0 0 0 0
0 0 0 0 0 0
0 0 0 0 0 0
0 0 0 0 0 0
0 0 0 0 0 0
[torch.DoubleTensor of dimension 5x6]
y = x:sub(2,4):fill(1) -- y is sub-tensor of x:
> y -- dimension 1 starts at index 2, ends at index 4
1 1 1 1 1 1
1 1 1 1 1 1
1 1 1 1 1 1
[torch.DoubleTensor of dimension 3x6]
> x -- x has been modified!
0 0 0 0 0 0
1 1 1 1 1 1
1 1 1 1 1 1
1 1 1 1 1 1
0 0 0 0 0 0
[torch.DoubleTensor of dimension 5x6]
As you can see, y is actually a view into a part of x, and changing its values changes x's values too. This mechanism is quite generic, so you can share multiple sub-tensors of the same underlying tensor.
So in your case it would be something like:
z = torch.Tensor(4):zero()
x = z:sub(1, 2)
y = z:sub(3, 4)
x[1] = 2
y[2] = 8
print(z)
prints
2
0
0
8
[torch.DoubleTensor of size 4]
as desired.
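Tying this back to the original question: you can still build z with torch.cat (which copies once), and then define x and y as views into z, so that subsequent writes through z are visible in x. A minimal sketch, assuming Torch7:

```lua
-- Sketch (assumes Torch7): build z once, then make x and y views of it.
local torch = require 'torch'

local z = torch.cat(torch.rand(2), torch.rand(2))  -- one-time copy; z owns the data
local x = z:sub(1, 2)  -- first half of z, shares z's storage
local y = z:sub(3, 4)  -- second half of z, shares z's storage

z[1] = 3
print(x[1])  -- 3: modifying z now does modify x
```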
Upvotes: 2