Reputation: 3128
I have two PyTorch tensors (really, just 1-D lists), t1 and t2. Is it possible to iterate over them in parallel, i.e. do something like for a, b in zip(t1, t2)?
Thanks.
Upvotes: 12
Views: 15885
Reputation: 56
To zip tensors in PyTorch into a single tensor, use torch.stack with dim=1.
Example:
import torch

t1 = torch.tensor([1, 2, 3])
t2 = torch.tensor([10, 20, 30])
t3 = torch.tensor([100, 200, 300])
res = torch.stack((t1, t2, t3), dim=1)
# output:
# tensor([[  1,  10, 100],
#         [  2,  20, 200],
#         [  3,  30, 300]])
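If the goal is parallel iteration, one possible follow-up (not part of the original answer) is to loop over the rows of res, each of which holds one element from every input tensor:
for a, b, c in res:          # each row is a 1-D tensor of length 3
    print(a.item(), b.item(), c.item())
# 1 10 100
# 2 20 200
# 3 30 300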
Upvotes: 3
Reputation: 2365
For me (Python version 3.7.3 and PyTorch version 1.0.0) the zip function works as expected with PyTorch tensors:
>>> import torch
>>> t1 = torch.ones(3)
>>> t2 = torch.zeros(3)
>>> list(zip(t1, t2))
[(tensor(1.), tensor(0.)), (tensor(1.), tensor(0.)), (tensor(1.), tensor(0.))]
The list call is just needed to display the result; iterating over the zip object works normally.
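As a small illustration, continuing the session above, a plain loop over the two tensors could look like this:
for a, b in zip(t1, t2):
    print(a.item(), b.item())
# 1.0 0.0
# 1.0 0.0
# 1.0 0.0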
Upvotes: 11
Reputation: 808
You can try torch.stack(seq, dim=0, out=None) → Tensor; for details, see the PyTorch documentation.
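As a rough sketch (not spelled out in the answer above): stacking along dim=0 puts each input tensor in its own row, so iterating in parallel would require transposing first:
import torch

t1 = torch.tensor([1, 2, 3])
t2 = torch.tensor([10, 20, 30])
res = torch.stack((t1, t2), dim=0)   # shape (2, 3): one row per input tensor
for a, b in res.t():                 # transpose to shape (3, 2), then unpack rows
    print(a.item(), b.item())
# 1 10
# 2 20
# 3 30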
Upvotes: 6
Reputation: 19895
It would make more sense to concatenate them with torch.cat along dim=1 (after unsqueezing the 1-D tensors to column vectors, since dim=1 does not exist on 1-D tensors); then you can iterate over the rows of the new tensor.
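A minimal sketch of that approach (the unsqueeze step is an assumption, not spelled out in the answer):
import torch

t1 = torch.tensor([1, 2, 3])
t2 = torch.tensor([10, 20, 30])
# make each 1-D tensor a (3, 1) column, then concatenate them side by side
res = torch.cat((t1.unsqueeze(1), t2.unsqueeze(1)), dim=1)   # shape (3, 2)
for a, b in res:
    print(a.item(), b.item())
# 1 10
# 2 20
# 3 30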
Upvotes: 1