Reputation: 817
I have two numpy arrays (an image and an environment map),
MatA
MatB
Both with shapes (256, 512, 3)
When I did the multiplication (element-wise) with numpy:
prod = np.multiply(MatA,MatB)
I got the expected result (visualized via Pillow after converting back to an image).
But when I did it using pytorch, I got a really strange result (not even close to the numpy one).
I did it with the following code:
MatATensor = transforms.ToTensor()(MatA)
MatBTensor = transforms.ToTensor()(MatB)
prodTensor = MatATensor * MatBTensor
For some reason, the shape of both MatATensor and MatBTensor is
torch.Size([3, 256, 512])
Same for the prodTensor too.
When I tried to reshape to (256, 512, 3), I got an error.
Is there a way to get the same result?
I am new to pytorch, so any help would be appreciated.
Upvotes: 3
Views: 2528
Reputation: 6115
I suggest you use torch.from_numpy, which will easily convert your ndarrays to torch tensors. As in:
In[1]: MatA = np.random.rand(256, 512, 3)
In[2]: MatB = np.random.rand(256, 512, 3)
In[3]: MatA_torch = torch.from_numpy(MatA)
In[4]: MatB_torch = torch.from_numpy(MatB)
In[5]: mul_np = np.multiply(MatA, MatB)
In[6]: mul_torch = MatA_torch * MatB_torch
In[7]: torch.equal(torch.from_numpy(mul_np), mul_torch)
Out[7]: True
In[8]: mul_torch.shape
Out[8]: torch.Size([256, 512, 3])
If you want it back to numpy, just do:
mul_torch.numpy()
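One detail worth knowing (a side note, not something the question asked about): torch.from_numpy does not copy the data — the tensor and the ndarray share the same memory, and .numpy() on a CPU tensor is likewise zero-copy, so the dtype survives the round trip. A minimal sketch:

```python
import numpy as np
import torch

# torch.from_numpy makes no copy: the tensor and the ndarray
# share the same underlying memory buffer.
arr = np.zeros((2, 2), dtype=np.float64)
t = torch.from_numpy(arr)

t[0, 0] = 5.0          # a write through the tensor...
print(arr[0, 0])       # ...is visible in the ndarray: 5.0

# .numpy() on a CPU tensor is also zero-copy, so the dtype
# (float64 here) is preserved in both directions.
back = t.numpy()
print(back.dtype)      # float64
```

This also means from_numpy keeps your arrays as float64, unlike transforms.ToTensor(), which produces float32.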
Upvotes: 1
Reputation: 114796
If you read the documentation of transforms.ToTensor() you'll see this transform not only converts a numpy array to a torch.FloatTensor (for uint8 inputs, also scaling the values from [0, 255] down to [0.0, 1.0]), but also transposes its dimensions from HxWx3 to 3xHxW.
To "undo" this you'll need to permute the channel axis back to the end:
prodAsNp = (prodTensor.permute(1, 2, 0) * 255).to(torch.uint8).numpy()
See permute
for more information.
Upvotes: 2