Reputation: 123
I have a list of tensors, all of the same shape, and I would like to sum them along an axis.
Does torch.cumsum perform this operation along a dim?
If so, does it require the list to be converted to a single tensor first and then summed over?
Upvotes: 11
Views: 31111
Reputation: 1296
You don't need cumsum; sum is your friend.
And yes, you should first convert the list into a single tensor with stack or cat, depending on your needs.
Something like this:
import torch

my_list = [torch.randn(3, 5), torch.randn(3, 5)]
# stack adds a new dim 0 of size len(my_list): shape (2, 3, 5).
# The first sum reduces over the list, the second over the rows.
result = torch.stack(my_list, dim=0).sum(dim=0).sum(dim=0)
print(result.shape)  # torch.Size([5])
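To expand on the two points above, here is a short sketch contrasting cumsum with sum, and stack with the cat alternative, using the same two (3, 5) tensors as the example:

```python
import torch

my_list = [torch.randn(3, 5), torch.randn(3, 5)]
stacked = torch.stack(my_list, dim=0)  # shape (2, 3, 5)

# cumsum keeps a running total along the dim, so nothing is reduced:
print(torch.cumsum(stacked, dim=0).shape)  # torch.Size([2, 3, 5])

# sum reduces the dim away -- this is the elementwise sum of the list:
print(stacked.sum(dim=0).shape)  # torch.Size([3, 5])

# cat joins along an existing dim instead of creating a new one,
# so a single sum over dim=0 collapses everything to shape (5,):
print(torch.cat(my_list, dim=0).sum(dim=0).shape)  # torch.Size([5])
```

Note that cat followed by one sum over dim=0 gives the same values as stack followed by two sums over dim=0, since both add up every row of every tensor in the list.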
Upvotes: 17