adeelz92

Reputation: 489

Concatenating two tensors with different dimensions in Pytorch

Is it possible to concatenate two tensors with different dimensions without using a for loop?

e.g. Tensor 1 has shape (15, 200, 2048) and Tensor 2 has shape (1, 200, 2048). Is it possible to concatenate the 2nd tensor with the 1st tensor along all 15 indices of the 1st dimension (i.e. broadcast the 2nd tensor along the 1st dimension of Tensor 1 while concatenating along the 3rd dimension)? The resulting tensor should have shape (15, 200, 4096).

Is it possible to accomplish this without a for loop?

Upvotes: 15

Views: 18124

Answers (1)

benjaminplanche

Reputation: 15149

You could do the broadcasting manually (using Tensor.expand()) before the concatenation (using torch.cat()):

import torch

a = torch.randn(15, 200, 2048)
b = torch.randn(1, 200, 2048)

# match a's leading dimension, keep the remaining dimensions unchanged (-1)
repeat_vals = [a.shape[0] // b.shape[0]] + [-1] * (len(b.shape) - 1)
# or directly repeat_vals = (15, -1, -1) or (15, 200, 2048) if shapes are known and fixed...
res = torch.cat((a, b.expand(*repeat_vals)), dim=-1)
print(res.shape)
# torch.Size([15, 200, 4096])
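An equivalent sketch that skips computing `repeat_vals` by hand: since `a` already has the target shape for the leading dimensions, `Tensor.expand_as()` can broadcast `b` directly (this assumes, as above, that `b`'s size is 1 along the dimension being broadcast). `expand_as()` returns a view without copying memory; the copy only happens inside `torch.cat()`:

```python
import torch

a = torch.randn(15, 200, 2048)
b = torch.randn(1, 200, 2048)

# expand_as() broadcasts b's singleton leading dim to match a (a view, no copy),
# then cat() joins the two tensors along the last dimension.
res = torch.cat((a, b.expand_as(a)), dim=-1)
print(res.shape)
# torch.Size([15, 200, 4096])
```

Note that `expand()`/`expand_as()` only broadcast dimensions of size 1, whereas `Tensor.repeat()` would physically tile the data; for a read-only broadcast feeding into `cat()`, the view-based expand is the cheaper choice.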

Upvotes: 18

Related Questions