rootpetit

Reputation: 461

Pytorch how to stack tensor like for loop

I want to concatenate the tensors generated in a for loop and get a 2-D tensor. In standard Python I would do it like below:

li = []
for i in range(0, len(items)):
    # calc something
    li.append(calc_result)

In my case, each loop iteration generates a tensor of shape torch.Size([768]), and I want to end up with a tensor of shape torch.Size([len(items), 768]).
How can I do this?

Upvotes: 5

Views: 15992

Answers (2)

James Hirschorn

Reputation: 8046

Note that torch.stack inserts a new dimension. If each tensor in li has shape [768], torch.stack(li, dim=0) already gives [len(items), 768], but if the per-item results carry an extra leading dimension (e.g. shape [1, 768]), stacking produces [len(items), 1, 768] instead.

In that case, use torch.vstack, which concatenates along the first dimension and takes no dim argument:

torch.vstack(li)

to get a tensor of shape [len(items), 768].
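
A quick sketch of the difference, using random tensors in place of the real per-item results (the [1, 768] shape is an assumption for illustration):

import torch

flat = [torch.randn(768) for _ in range(4)]        # 1-D per-item results
batched = [torch.randn(1, 768) for _ in range(4)]  # results with a leading batch dimension

print(torch.stack(flat, dim=0).shape)     # torch.Size([4, 768])
print(torch.vstack(batched).shape)        # torch.Size([4, 768])
print(torch.stack(batched, dim=0).shape)  # torch.Size([4, 1, 768]) -- extra dimension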

Upvotes: 0

iacolippo

Reputation: 4513

You can use torch.stack:

torch.stack(li, dim=0)

Calling this after the for loop will give you a torch.Tensor of that size.
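
For example, with dummy [768] tensors standing in for the loop results (a sketch, not the actual computation from the question):

import torch

li = [torch.randn(768) for _ in range(5)]  # stand-ins for the per-iteration results
out = torch.stack(li, dim=0)
print(out.shape)                           # torch.Size([5, 768])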

Note that if you know in advance the size of the final tensor, you can allocate an empty tensor beforehand and fill it in the for loop:

x = torch.empty(size=(len(items), 768))
for i in range(len(items)):
    # calc something, producing calc_result of shape [768]
    x[i] = calc_result

This is usually faster than doing the stack.
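
An end-to-end sketch of this preallocation approach, with a random tensor standing in for the real per-item computation:

import torch

items = list(range(100))                 # placeholder items
x = torch.empty(size=(len(items), 768))
for i, item in enumerate(items):
    calc_result = torch.randn(768)       # stand-in for the real computation on item
    x[i] = calc_result

print(x.shape)                           # torch.Size([100, 768])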

Upvotes: 11
