Reputation: 31
I'm trying to stack two tensors, A.shape=(64, 16, 16)
and B.shape=(64, 16, 16),
into a tensor of shape C.shape=(1, 128, 16, 16),
but none of the functions I've tried gives that shape:
torch.stack
=> C.shape=(2, 64, 16, 16)
torch.cat
=> C.shape=(128, 16, 16)
Can anyone help me?
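A minimal reproduction of what I see (random tensors stand in for the real A and B):

import torch

A = torch.randn(64, 16, 16)
B = torch.randn(64, 16, 16)

print(torch.stack([A, B]).shape)  # torch.Size([2, 64, 16, 16])
print(torch.cat([A, B]).shape)    # torch.Size([128, 16, 16])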
Upvotes: 1
Views: 199
Reputation: 7723
Concatenate first, then use unsqueeze
to add a singleton dimension at the 0th
position:
torch.cat([A, B]).unsqueeze(0)
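For context, a minimal sketch showing the resulting shape (random tensors stand in for your A and B):

import torch

A = torch.randn(64, 16, 16)
B = torch.randn(64, 16, 16)

# cat along dim 0 gives (128, 16, 16); unsqueeze(0) adds the leading dim
C = torch.cat([A, B]).unsqueeze(0)
print(C.shape)  # torch.Size([1, 128, 16, 16])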
Upvotes: 3