Royi

Reputation: 4963

Computation of Lower Rank Matrix

Given a 3x3 matrix $ A $ of rank 3, I want to create the matrix of rank 2 which is closest to $ A $ in the $ {\ell}_{2} $ / Frobenius norm. Let's call this matrix $ F $.

This is easy to achieve with the SVD: if $ A = U S {V}^{H} $ is the SVD decomposition, then $ F = U \hat{S} {V}^{H} $, where $ \hat{S} $ is the same as $ S $ with the last singular value zeroed.
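
For reference, the construction just described, as a minimal MATLAB sketch:

[U, S, V] = svd(A);
S(3,3) = 0;                       % zero the smallest singular value
F = U*S*V';                       % closest rank-2 matrix in l2/Frobenius norm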

The question is: is there a less computationally intensive method to create F, without using the SVD decomposition?

Thanks.

Upvotes: 4

Views: 514

Answers (3)

user85109

Reputation:

If you KNOW the matrix to be rank 3, then exactly 3 Householder transformations will be sufficient to reduce an nxm matrix to a 3xm matrix. One can now easily convert this into an eigenvalue problem. Compute the characteristic polynomial. For example, consider this matrix (I'll use MATLAB to do the work):

>> A = rand(10,3);
>> B = A*rand(3,6);

Clearly, B will have rank 3, but if you don't trust me, rank confirms that claim.

>> rank(B)
ans =
     3

>> size(B)
ans =
    10     6

So B is a 10x6 matrix, with rank 3. The SVD confirms that fact also.

>> svd(B)
ans =
          6.95958965358531
          1.05904552889497
         0.505730235622534
      7.37626877572817e-16
      2.36322066654691e-16
      1.06396598411356e-16

I'm feeling too lazy to write the Householder transformations. (I've got some code for that lying around, but...) So I'll use QR to help me out.

>> [Q,R] = qr(B,0);
>> C = Q(:,1:3)'*B
C =
   -2.0815   -1.7098   -3.7897   -1.6186   -3.6038   -3.0809
    0.0000    0.91329   0.78347   0.44597  -0.072369   0.54196
    0.0000    0.0000   -0.2285   -0.43721  -0.85949  -0.41072

The multiply here shows what we would have seen after three Householder transformations. See that it is upper triangular, as we would expect. Next, compute the characteristic polynomial. I'll do it symbolically here, using my own tools, but the computation is just a bit of algebra.

>> sympoly lambda
>> det(C*C' - lambda*eye(3))
ans =
    13.8942 - 66.9996*lambda + 49.8132*lambda^2 - lambda^3

>> P = det(C*C' - lambda*eye(3))
P =
    13.8942 - 66.9996*lambda + 49.8132*lambda^2 - lambda^3

What are the roots of P, and thus the eigenvalues of C*C'?

>> r = roots(P)
r =
       48.436
       1.1216
      0.25576

We know that the eigenvalues must be the squares of the singular values here, so a good test is to see whether we recovered the singular values that svd found. Expanding the display format again, we see it did so nicely.

>> sqrt(roots(P))
ans =
          6.95958965358531
          1.05904552889497
         0.505730235622533

Mathematics can be fun when it works. What can we do with this information? If I know a specific eigenvalue, I can compute the corresponding eigenvector. Essentially, we need to solve the linear 3x3 homogeneous system of equations

(C*C' - eye(3)*r(3)) * X = 0

Again, I'll be lazy and find the solution without actually writing any code. Gaussian elimination would do it, though.

>> V = null((C*C' - eye(3)*r(3)))
V =
        -0.171504758161731
        -0.389921448437349
         0.904736084157367

So we have V, an eigenvector of C*C'. We can convince ourselves that it is one by the following call to svd.

>> svd(C - V*(V'*C))
ans =
           6.9595896535853
          1.05904552889497
      2.88098729108798e-16

So by subtracting off that component of C in the direction of V, we get a rank 2 matrix.

Similarly, we can transform V back into the original problem space, and use it to update our original matrix B with a rank one update.

>> U = Q(:,1:3)*V;
>> D = B - U*(U'*B);

What is the rank of D?

>> svd(D)
ans =
          6.95958965358531
          1.05904552889497
      2.62044567948618e-15
      3.18063391331806e-16
      2.16520878207897e-16
      1.56387805987859e-16

>> rank(D)
ans =
     2

As you can see, even though I did a lot of mathematics, calling svd, QR, rank, etc., all multiple times, in the end the actual computation was relatively trivial. I needed only the following (collected into a sketch after the list):

  1. 3 Householder transformations. (Store them for later use. Note that a Householder transformation is simply a rank one update to your matrix.)
  2. Compute the characteristic polynomial.
  3. Compute the smallest root using your favorite method for a cubic polynomial.
  4. Recover the corresponding eigenvector. Gaussian elimination is sufficient here.
  5. Map that eigenvector back into the original space using the Householder transformations we originally did.
  6. A rank one update to the matrix.
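
Here is a minimal MATLAB sketch collecting these steps, with qr, roots, and null standing in for hand-rolled Householder reflections, a cubic solver, and Gaussian elimination (the function name is mine):

function D = rank3ToRank2(B)
    % Assumes B is known to have rank exactly 3.
    [Q, ~] = qr(B, 0);                % economy QR plays the role of the 3 Householder steps
    C = Q(:,1:3)'*B;                  % 3-by-m reduced matrix
    M = C*C';                         % 3-by-3 symmetric matrix
    c2 = trace(M);                    % det(lambda*I - M) =
    c1 = (trace(M)^2 - trace(M*M))/2; %   lambda^3 - c2*lambda^2 + c1*lambda - c0
    c0 = det(M);
    r = roots([1, -c2, c1, -c0]);     % eigenvalues of M (squared singular values)
    V = null(M - min(r)*eye(3));      % eigenvector of the smallest eigenvalue
    U = Q(:,1:3)*V;                   % back into the original space
    D = B - U*(U'*B);                 % rank one update kills the smallest singular value
end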

All of these computational steps would be fast and efficient for ANY size matrix, as long as we know the actual rank is 3. Not even worth a paper on the subject.

EDIT:

Since the question has been revised such that the matrix itself is only of size 3x3, the computation is even more trivial. I'll leave the first part of the post as is, however, since it describes a fully valid solution for a general matrix of any size that has rank 3.

If your goal is to kill off the last singular value of a 3x3 matrix, the svd on a 3x3 seems like it would be pretty efficient to do. There will also be some loss of precision in generating the last singular value by indirect means. For example, compare the smallest singular value computed by svd with the one computed via the eigenvalues of A'*A. You may see some small error, since forming A'*A causes some loss of precision. The extent of this loss will probably depend on the condition number of A.

>> A = rand(3);
>> sqrt(eig(A'*A))
ans =
        0.0138948003095069
         0.080275195586312
          1.50987693453097
>> svd(A)
ans =
          1.50987693453097
        0.0802751955863124
        0.0138948003095054

However, if you really want to do the work yourself, you could do it as follows (a sketch appears after the list).

  1. Compute the characteristic polynomial directly from the 3x3 matrix A'*A.
  2. Compute the roots of that cubic polynomial. Choose the smallest root.
  3. Generate the corresponding eigenvector.
  4. Do a rank one update on A, killing off the part of A that lies in the direction of that eigenvector.
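
A minimal MATLAB sketch of this recipe, again with roots and null standing in for the cubic formula and Gaussian elimination:

M = A'*A;                         % 3-by-3 symmetric matrix
c2 = trace(M);                    % det(lambda*I - M) =
c1 = (trace(M)^2 - trace(M*M))/2; %   lambda^3 - c2*lambda^2 + c1*lambda - c0
c0 = det(M);
r = roots([1, -c2, c1, -c0]);     % squared singular values of A
v = null(M - min(r)*eye(3));      % right singular vector of the smallest one
F = A - (A*v)*v';                 % rank one update: F has rank 2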

Is this any simpler or less computationally intensive than a simple call to the svd, followed by a rank one update? I'm not at all sure it is worth the effort on a 3x3. A 3x3 svd really is extremely fast to compute.

Upvotes: 5

Michael J. Barber

Reputation: 25042

You can use power iteration to find the singular vectors corresponding to the largest singular value. Once you have that, you can use a modified version of power iteration to find the vectors for the second largest singular value; after each iteration you'll need to subtract off the projection of the partial solution vector onto the vectors for the largest singular value.

This is a poor approach for finding all the singular vectors, but should work just fine for the first two.
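
A minimal MATLAB sketch of the idea, assuming A is the matrix in question; the fixed iteration count is an arbitrary choice, and a production version would test for convergence:

v1 = randn(size(A,2), 1);  v1 = v1/norm(v1);
for k = 1:200                     % power iteration on A'*A
    v1 = A'*(A*v1);  v1 = v1/norm(v1);
end
s1 = norm(A*v1);  u1 = (A*v1)/s1; % largest singular triplet

v2 = randn(size(A,2), 1);
for k = 1:200
    v2 = v2 - v1*(v1'*v2);        % subtract the projection onto v1 (deflation)
    v2 = A'*(A*v2);  v2 = v2/norm(v2);
end
s2 = norm(A*v2);  u2 = (A*v2)/s2; % second singular triplet

F = s1*(u1*v1') + s2*(u2*v2');    % rank-2 approximation of A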

Upvotes: 0

kkm mistrusts SE

Reputation: 5530

You probably already considered this, but if A is normal, the SVD can be computed via an eigenvalue decomposition. This amounts to solving the characteristic polynomial, and, since it is a degree 3 polynomial for a rank 3 matrix, the roots are found by the well-known cubic formulae.
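
A quick MATLAB illustration of that relationship: for a normal (here symmetric) matrix, the singular values are the absolute values of the eigenvalues.

A = randn(3);  A = A + A';        % symmetric, hence normal
sort(abs(eig(A)))                 % absolute eigenvalues...
sort(svd(A))                      % ...match the singular values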

It also looks like the SVD in general must be reducible to solving cubics for a matrix of rank 3, but I cannot remember reading anything on that. A quick google-around turned up this piece of code that claims to solve the rank 3 SVD in this manner. There is no accompanying paper, unfortunately. If you use this code, testing it with a non-positive-definite matrix would be a good idea.

Edit: On a second reading, it seems that the author there is using an eigendecomposition as well. That is likely not good on non-PD matrices, but I'd like to be proven wrong here.

Upvotes: 0
