Reputation: 138
I have a large positive-definite matrix M of dimension (10000, 10000) and I want to do the following operation:
r = transpose(e_i - e_j) * M * (e_i - e_j)
where e_i is a (10000 x 1) vector with all zero entries except a 1 at the ith entry.
I want to do this operation for (e_1, e_2), ..., (e_1, e_10000), ..., i.e. for every pair i, j in {1, ..., 10000}.
I have tried
- doing the calculation directly
- Cholesky factorization
Nevertheless, both methods take too long: each takes >= 0.5 seconds for a single calculation, so doing it for all pairs is not feasible in time.
Is there any method/library I could try to speed up this process?
Upvotes: 0
Views: 121
Reputation: 14480
Your expression is transpose(e_i - e_j) * M * (e_i - e_j). Let's take the last part, which is M * (e_i - e_j), and multiply it out to get M * e_i - M * e_j.

Now, since e_i is zeros except for the ith element, M * e_i just selects the ith column of M, which we can do with no calculation as M[:,i]. Taking the dot product with e_i again, this gives us M[i,i].
Multiplying your expression out fully and applying this idea to all 4 terms gives the answer @dmuir gives in the comment above:
r = M[i,i] - M[j,i] - M[i,j] + M[j,j]
This expression should be very fast and does not (in principle) depend on the size of your matrix at all. So you actually don't need any matrix operations at all to compute what you asked for.
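A quick numerical check of this identity (a small sketch with NumPy; the matrix size and the indices here are arbitrary, chosen small so the brute-force computation is cheap):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6  # small size for the check; the identity is size-independent
A = rng.standard_normal((n, n))
M = A @ A.T + n * np.eye(n)  # symmetric positive definite test matrix

i, j = 1, 4
e_i = np.zeros(n); e_i[i] = 1.0
e_j = np.zeros(n); e_j[j] = 1.0

direct = (e_i - e_j) @ M @ (e_i - e_j)           # the original quadratic form
fast = M[i, i] - M[j, i] - M[i, j] + M[j, j]     # the four-entry shortcut
print(np.isclose(direct, fast))
```

The two values agree to floating-point precision, and the shortcut touches only four entries of M.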
Since you said that you want to do this for all i, j pairs, you probably want to end up with a matrix R so that R[i, j] is the r given above. You could do that in a loop over i, j, or you could vectorise it with something like
d = np.array([np.diag(M)])
R = d + d.T - M - M.T
For the size you state, I would expect this whole operation to complete in around 1 second.
Given the size of the matrices I would probably try to avoid temporaries with
d = np.array([np.diag(M)])
R = d + d.T
R -= M
R -= M.T
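Putting the vectorised version together and cross-checking it against the per-pair formula (a sketch; the small matrix here stands in for the 10000 x 10000 one):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))
M = A @ A.T + n * np.eye(n)  # symmetric positive definite test matrix

d = np.array([np.diag(M)])   # row vector of diagonal entries, shape (1, n)
R = d + d.T                  # R[i, j] = M[i, i] + M[j, j], via broadcasting
R -= M                       # subtract M[i, j] in place, avoiding a temporary
R -= M.T                     # subtract M[j, i] in place

# cross-check every entry against the four-term formula
for i in range(n):
    for j in range(n):
        expected = M[i, i] - M[j, i] - M[i, j] + M[j, j]
        assert np.isclose(R[i, j], expected)
print("OK")
```

The broadcasting of `d + d.T` expands a (1, n) and an (n, 1) array into the full (n, n) sum of diagonal entries, so no Python-level loop is needed.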
Upvotes: 0
Reputation: 6740
You should look into the Strassen algorithm. It is designed for multiplying large matrices in O(n^2.8) time. Sample implementation here.
Upvotes: 1