Ruchit Patel

Reputation: 795

Broadcasting element wise multiplication in pytorch

I have a tensor in pytorch with size torch.Size([1443747, 128]). Let's name it tensor A. In this tensor, 128 represents a batch size. I have another 1D tensor with size torch.Size([1443747]). Let's call it B. I want to do element wise multiplication of B with A, such that B is multiplied with all 128 columns of tensor A (obviously in an element wise manner). In other words, I want to broadcast the element wise multiplication along dimension=1. How can I achieve this in pytorch?

If I didn't have a batch size involved in tensor A (batch size = 1), then the normal * operator would do the multiplication easily: A*B would then produce a resultant tensor of size torch.Size([1443747]). However, I don't understand why pytorch does not broadcast the tensor multiplication along dimension 1. Is there any way to do this?

What I want is: B should be multiplied with all 128 columns of A in an element wise manner. So, the resultant tensor's size would be torch.Size([1443747, 128]).
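For concreteness, a small sketch reproducing the problem (with 5 rows instead of 1443747, so it runs quickly):

```python
import torch

# Small stand-ins for the real tensors: 5 rows instead of 1443747
A = torch.randn(5, 128)  # 2D tensor, 128 columns
B = torch.randn(5)       # 1D tensor

# A * B raises a RuntimeError: PyTorch broadcasting aligns
# trailing dimensions, so B's size 5 is compared against A's
# last dimension 128, and neither is 1.
try:
    C = A * B
except RuntimeError:
    print("shapes [5, 128] and [5] are not broadcastable")
```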

Upvotes: 1

Views: 3070

Answers (1)

Victor Zuanazzi

Reputation: 1974

The dimensions need to match for broadcasting; it will work if you either transpose A or unsqueeze B:

C = A.transpose(1,0) * B    # shape: [128, 1443747]

or

C = A * B.unsqueeze(dim=1)  # shape: [1443747, 128]

Note that the shapes of the two solutions are different.
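To illustrate, a minimal check (again with 5 rows standing in for 1443747) showing that the two solutions produce the same values, just with transposed layouts:

```python
import torch

A = torch.randn(5, 128)
B = torch.randn(5)

# Solution 1: transpose A so B broadcasts over the trailing dimension
C1 = A.transpose(1, 0) * B        # shape: [128, 5]

# Solution 2: unsqueeze B to shape [5, 1] so it broadcasts over columns
C2 = A * B.unsqueeze(dim=1)       # shape: [5, 128]

# The results agree up to a transpose
assert torch.allclose(C1.transpose(1, 0), C2)
```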

Upvotes: 2

Related Questions