lolilukia

Reputation: 11

How can I optimize the 5-layer loop using functions provided by torch?

x is a tensor of shape (16, 10, 4, 25, 53), and y has the same shape. mean has shape (25, 53); jc and ac both have shape (16, 10, 4).

How can I optimize the following expression with torch functions?

num1 = num2 = num3 = 0.0
for k in range(x.size(0)):
    for s in range(x.size(1)):
        for u in range(x.size(2)):
            for i in range(x.size(3)):
                for j in range(x.size(4)):
                    num1 += (x[k, s, u, i, j] - mean[i, j] - jc[k, s, u]) * (y[k, s, u, i, j] - mean[i, j] - ac[k, s, u])
                    num2 += (y[k, s, u, i, j] - mean[i, j] - jc[k, s, u]) ** 2
                    num3 += (y[k, s, u, i, j] - mean[i, j] - ac[k, s, u]) ** 2

Upvotes: 1

Views: 55

Answers (1)

Shai

Reputation: 114786

I think you are looking to broadcast your tensors along singleton dimensions.
First, the number of dimensions must match. If mean has shape (25, 53), then mean[None, None, None, ...] has shape (1, 1, 1, 25, 53): nothing changes in the underlying data, but the tensor is now 5-D instead of 2-D, and those singleton dimensions can be broadcast against the corresponding dimensions of x and y. Likewise, jc[..., None, None] has shape (16, 10, 4, 1, 1), aligning jc with the first three axes of x.
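The reshaping can be checked quickly with the shapes from the question (the values here are zeros just for illustration):

```python
import torch

mean = torch.zeros(25, 53)
jc = torch.zeros(16, 10, 4)

# Three leading singleton dimensions make mean 5-D; no data is copied.
print(mean[None, None, None, ...].shape)  # torch.Size([1, 1, 1, 25, 53])

# Two trailing singleton dimensions align jc with the last two axes of x.
print(jc[..., None, None].shape)  # torch.Size([16, 10, 4, 1, 1])
```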

An optimized code using broadcasting will look something like:

num1 = ((x - mean[None, None, None, ...] - jc[..., None, None]) * (y - mean[None, None, None, ...] - ac[..., None, None])).sum()
num2 = ((y - mean[None, None, None, ...] - jc[..., None, None]) ** 2).sum()  # shouldn't it be x here?
num3 = ((y - mean[None, None, None, ...] - ac[..., None, None]) ** 2).sum()
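As a sanity check, the loop and the broadcast version can be compared on small random tensors (shapes reduced here just to keep the loop fast; this is a sketch, not part of the original question):

```python
import torch

torch.manual_seed(0)
x = torch.randn(2, 3, 4, 5, 6)
y = torch.randn(2, 3, 4, 5, 6)
mean = torch.randn(5, 6)
jc = torch.randn(2, 3, 4)
ac = torch.randn(2, 3, 4)

# Reference: the original 5-layer loop.
num1 = num2 = num3 = 0.0
for k in range(x.size(0)):
    for s in range(x.size(1)):
        for u in range(x.size(2)):
            for i in range(x.size(3)):
                for j in range(x.size(4)):
                    num1 += (x[k, s, u, i, j] - mean[i, j] - jc[k, s, u]) * (y[k, s, u, i, j] - mean[i, j] - ac[k, s, u])
                    num2 += (y[k, s, u, i, j] - mean[i, j] - jc[k, s, u]) ** 2
                    num3 += (y[k, s, u, i, j] - mean[i, j] - ac[k, s, u]) ** 2

# Broadcast version: all three sums in a handful of vectorized ops.
m = mean[None, None, None, ...]
b1 = ((x - m - jc[..., None, None]) * (y - m - ac[..., None, None])).sum()
b2 = ((y - m - jc[..., None, None]) ** 2).sum()
b3 = ((y - m - ac[..., None, None]) ** 2).sum()

print(torch.allclose(b1, num1, rtol=1e-3, atol=1e-3))
print(torch.allclose(b2, num2, rtol=1e-3, atol=1e-3))
print(torch.allclose(b3, num3, rtol=1e-3, atol=1e-3))
```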

Upvotes: 2
