I have a word embedding (a tensor) of size torch.Size([8, 768]) stored in the variable embeddings, which looks something like this:
tensor([[-0.0687, -0.1327, 0.0112, ..., 0.0715, -0.0297, -0.0477],
[ 0.0115, -0.0029, 0.0323, ..., 0.0277, -0.0297, -0.0599],
[ 0.0760, 0.0788, 0.1640, ..., 0.0574, -0.0805, 0.0066],
...,
[-0.0110, -0.1773, 0.1143, ..., 0.1397, 0.3021, 0.1670],
[-0.1379, -0.0294, -0.0026, ..., -0.0966, -0.0726, 0.1160],
[ 0.0466, -0.0113, 0.0283, ..., -0.0735, 0.0496, 0.0963]],
grad_fn=<IndexBackward>)
Now, I wish to take the mean of some of the embeddings and place the mean back in the tensor. For example (I'll explain with a list rather than a tensor):
a = [1,2,3,4,5]
output = [1.5, 3, 4, 5]
Here I have taken the mean of 1 and 2 and placed it in the output list, shifting the remaining elements to the left (see the sketch below). I want to do the same thing with tensors.
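For concreteness, a minimal list-based sketch of the operation; the concrete values of i and j here are my own assumption, chosen to match the slicing convention used in the code further down:
a = [1, 2, 3, 4, 5]
i, j = 1, 2  # average the slice a[i-1:j], i.e. the elements 1 and 2

avg = sum(a[i-1:j]) / (j - (i - 1))
output = a[:i-1] + [avg] + a[j:]
print(output)  # [1.5, 3, 4, 5]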
The index from which I need to start averaging is stored in the variable i, and the variable j holds the stopping index. Now, let's look at the code:
if i != len(embeddings):
    sum = 0
    count = 0
    # Calculating the sum
    for x in range(i-1, j):
        sum += text_index[x]
        count += 1
    avg = sum/count
    # Inserting the average in place of the other embeddings
    embeddings = embeddings[:i-1] + [avg] + embeddings[j:]
else:
    pass
Now, I am getting an error at the line embeddings = embeddings[:i-1] + [avg] + embeddings[j:]
The error is:
TypeError: unsupported operand type(s) for +: 'Tensor' and 'list'
I understand that the above code would have worked if embeddings were a list, but it is a tensor. How do I do this?
NOTE:
1. embeddings.shape : torch.Size([8, 768])
2. avg is of type float
Upvotes: 0
Views: 2599
Reputation: 33010
To concatenate multiple tensors you can use torch.cat, which concatenates a list of tensors along a specified dimension. This requires that all tensors have the same number of dimensions, and that every dimension except the one they are concatenated on has the same size.
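For instance, a quick sketch of that shape requirement (the tensors here are just random placeholders):
import torch

a = torch.randn(3, 768)
b = torch.randn(1, 768)
c = torch.randn(4, 768)

# All three tensors are 2-D and agree on every dimension except dim 0,
# so they can be concatenated along dim 0.
out = torch.cat([a, b, c], dim=0)
print(out.shape)  # torch.Size([8, 768])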
Your embeddings has size [8, 768], so the left part and the right part will have sizes [num_left, 768] and [num_right, 768] respectively. avg should have size [768] (a tensor, not a single float, which means the loop should be summing embeddings[x] rather than text_index[x]) since you are averaging multiple embeddings into one. In order to concatenate it with the two other parts, it needs to have size [1, 768], so that it can be concatenated along the first dimension to create a tensor of size [num_left + 1 + num_right, 768]. The singleton first dimension can be added with torch.unsqueeze.
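As a quick illustration of the shape change (with an arbitrary placeholder tensor):
import torch

avg = torch.randn(768)  # result of averaging rows: shape [768]
avg = avg.unsqueeze(0)  # add a leading dimension: shape [1, 768]
print(avg.shape)        # torch.Size([1, 768]) -- ready for torch.cat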
embeddings = torch.cat([embeddings[:i-1], avg.unsqueeze(0), embeddings[j:]], dim=0)
The for loop can also be replaced by slicing the tensor and taking the mean with torch.mean.
# keepdim=True keeps the dimension that the average is taken on
# So the output has size [1, 768] instead of [768]
avg = torch.mean(embeddings[i-1:j], dim=0, keepdim=True)
embeddings = torch.cat([embeddings[:i-1], avg, embeddings[j:]], dim=0)
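Putting it together, a self-contained sketch; the values of embeddings, i, and j are made-up placeholders:
import torch

embeddings = torch.randn(8, 768)
i, j = 2, 4  # average rows i-1 up to (but not including) row j

# keepdim=True keeps the result at size [1, 768]
avg = torch.mean(embeddings[i-1:j], dim=0, keepdim=True)
embeddings = torch.cat([embeddings[:i-1], avg, embeddings[j:]], dim=0)

print(embeddings.shape)  # torch.Size([6, 768]): three rows collapsed into one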
Upvotes: 0