glethien

Reputation: 2471

Decomposition of essential matrix leads to wrong rotation and translation

I am doing some SfM and having troubles getting R and T from the essential matrix.

Here is what I am doing in source code:

Mat fundamental = Calib3d.findFundamentalMat(object_left, object_right);
Mat E = new Mat();

Core.multiply(cameraMatrix.t(), fundamental, E); // cameraMatrix.t()*fundamental*cameraMatrix;
Core.multiply(E, cameraMatrix, E);

Mat R = new Mat();
Mat.zeros(3, 3, CvType.CV_64FC1).copyTo(R);

Mat T = new Mat();

calculateRT(E, R, T);

where `calculateRT` is defined as follows:

private void calculateRT(Mat E, Mat R, Mat T) {

    /*
     * //-- Step 6: calculate Rotation Matrix and Translation Vector
        Matx34d P;
        //decompose E 
        SVD svd(E,SVD::MODIFY_A);
        Mat svd_u = svd.u;
        Mat svd_vt = svd.vt;
        Mat svd_w = svd.w;
        Matx33d W(0,-1,0,1,0,0,0,0,1);//HZ 9.13
        Mat_<double> R = svd_u * Mat(W) * svd_vt; //
        Mat_<double> T = svd_u.col(2); //u3

        if (!CheckCoherentRotation (R)) {
            std::cout<<"resulting rotation is not coherent\n";
            return 0;
        }
     */
    Mat w = new Mat();
    Mat u = new Mat();
    Mat vt = new Mat();

    Core.SVDecomp(E, w, u, vt, Core.DECOMP_SVD); // Maybe use flags
    double[] W_Values = {0, -1, 0, 1, 0, 0, 0, 0, 1}; // W from HZ 9.13
    Mat W = new Mat(new Size(3,3), CvType.CV_64FC1);
    W.put(0, 0, W_Values);

    Core.multiply(u, W, R);
    Core.multiply(R, vt, R);

    T = u.col(2);
}
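For reference, the decomposition step from the commented C++ can be sketched in plain Java without OpenCV (class and helper names here are made up for illustration), using the U, W (HZ 9.13) and vt values printed in the debug output below, and finishing with the CheckCoherentRotation test det(R) ≈ +1:

```java
// Plain-Java sketch (no OpenCV) of R = U * W * Vt followed by the det(R) = +1
// coherency check from the commented C++ above. Matrix values are taken from
// the debug output below; W is the matrix from HZ eq. 9.13.
class EssentialDecompositionSketch {

    // 3x3 matrix product c = a * b
    static double[][] matMul3(double[][] a, double[][] b) {
        double[][] c = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }

    // determinant of a 3x3 matrix (cofactor expansion along the first row)
    static double det3(double[][] m) {
        return m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
             - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
             + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]);
    }

    // A valid rotation matrix must have determinant +1 (up to numerical noise).
    static boolean checkCoherentRotation(double[][] r) {
        return Math.abs(det3(r) - 1.0) < 1e-7;
    }

    public static void main(String[] args) {
        double[][] U  = {{0, 0, 1}, {0, 1, 0}, {1, 0, 0}};
        double[][] W  = {{0, -1, 0}, {1, 0, 0}, {0, 0, 1}}; // HZ 9.13
        double[][] Vt = {{0, 0, 1}, {0, 1, 0}, {1, 0, 0}};
        double[][] R = matMul3(matMul3(U, W), Vt);
        System.out.println("det(R) = " + det3(R)
                + ", coherent = " + checkCoherentRotation(R));
    }
}
```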

And here are the values of all matrices during and after the calculation.

    Number matches: 10299
    Number of good matches: 590
    Number of obj_points left: 590.0


         CameraMatrix: 
                        [1133.601684570312,         0,             639.5;
                               0 ,          1133.601684570312,     383.5;
                               0,                   0,               1]


       DistortionCoeff: [0.06604336202144623; 0.21129509806633; 0; 0; -1.206771731376648]


    Fundamental: 
    [4.209958176688844e-08, -8.477216249742946e-08, 9.132798068178793e-05;
    3.165719895008366e-07, 6.437858397735847e-07, -0.0006976204595236443;
    0.0004532506630569588, -0.0009224427024602799, 1]

    Essential: 
    [0.05410018455525099, 0, 0;
    0, 0.8272987826496967, 0;
    0, 0, 1]

    U: (SVD)
    [0, 0, 1;
     0, 0.9999999999999999, 0;
     1, 0, 0]

    W: (SVD) 
    [1; 0.8272987826496967; 0.05410018455525099]

    vt: (SVD)
    [0, 0, 1;
     0, 1, 0;
     1, 0, 0]


    R: 
    [0, 0, 0;
     0, 0, 0;
     0, 0, 0]

    T: 
    [1; 0; 0]

And for completeness, here are the images I am using: left and right.

Before calculating the feature points and so on, I am undistorting the images.

Can someone point out where something is going wrong or what I am doing wrong?

edit: Is it possible that my fundamental matrix is equal to the essential matrix, since I am in the calibrated situation? Hartley and Zisserman say:

"11.7.3 The calibrated case: In the case of calibrated cameras normalized image coordinates may be used, and the essential matrix E computed instead of the fundamental matrix"
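For context, the calibrated route described there means mapping pixel coordinates through K^-1 before the eight-point estimation, so that the estimated matrix is E directly. A minimal plain-Java sketch of that normalization (class and method names are made up; intrinsics hard-coded from the CameraMatrix printed above):

```java
// Normalizing a pixel coordinate with the inverse intrinsics: x_hat = K^-1 * x.
// For K = [fx 0 cx; 0 fy cy; 0 0 1] the inverse has the closed form used below.
class NormalizeSketch {
    static final double FX = 1133.601684570312, FY = 1133.601684570312;
    static final double CX = 639.5, CY = 383.5;

    // maps a pixel (u, v) to normalized image coordinates (x_hat, y_hat)
    static double[] normalize(double u, double v) {
        return new double[]{(u - CX) / FX, (v - CY) / FY};
    }

    public static void main(String[] args) {
        double[] n = normalize(639.5, 383.5); // the principal point maps to the origin
        System.out.println(n[0] + ", " + n[1]);
    }
}
```

Running findFundamentalMat on coordinates normalized this way would make the K^T * F * K step unnecessary.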

Upvotes: 1

Views: 1113

Answers (2)

glethien

Reputation: 2471

I've found the mistake. This code is not doing the right matrix multiplication:

  Mat E = new Mat();
  Core.multiply(cameraMatrix.t(),fundamental, E); 
  Core.multiply(E, cameraMatrix, E);

I changed this to

  Mat KtF = new Mat();
  Core.gemm(cameraMatrix.t(), fundamental, 1, new Mat(), 0, KtF); // KtF = K^T * F
  Core.gemm(KtF, cameraMatrix, 1, new Mat(), 0, E);               // E = K^T * F * K

which now does the right matrix multiplication (note that a single call Core.gemm(A, B, alpha, C, beta, dst) computes alpha*A*B + beta*C, so the triple product needs two chained calls). As far as I can tell from the documentation, Core.multiply does the multiplication element by element, not the dot product of row*col.
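To see the difference concretely, here is a plain-Java sketch (no OpenCV; names made up) of what the two operations compute; the per-element product is also why the "Essential" matrix printed in the question kept only the diagonal:

```java
class MultiplyVsGemm {
    // What Core.multiply computes: the Hadamard (per-element) product.
    static double[][] hadamard(double[][] a, double[][] b) {
        double[][] c = new double[a.length][a[0].length];
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < a[0].length; j++)
                c[i][j] = a[i][j] * b[i][j];
        return c;
    }

    // What Core.gemm computes (with alpha = 1, beta = 0): the true
    // row-by-column matrix product.
    static double[][] matMul(double[][] a, double[][] b) {
        double[][] c = new double[a.length][b[0].length];
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < b[0].length; j++)
                for (int k = 0; k < b.length; k++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }

    public static void main(String[] args) {
        double[][] a = {{1, 2}, {3, 4}};
        double[][] b = {{0, 1}, {1, 0}};
        // hadamard: {{0, 2}, {3, 0}}   matMul: {{2, 1}, {4, 3}}
        System.out.println(java.util.Arrays.deepToString(hadamard(a, b)));
        System.out.println(java.util.Arrays.deepToString(matMul(a, b)));
    }
}
```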

Upvotes: 1

BConic

Reputation: 8980

First, unless you computed the fundamental matrix by explicitly taking the inverse of the camera matrix into account, you are not in the calibrated case, hence the matrix you estimate is not an essential matrix. This is also quite easy to test: you just have to take the singular value decomposition of the matrix and check whether its two non-zero singular values are equal (see § 9.6.1 in Hartley & Zisserman's book).
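To illustrate, here is a plain-Java sketch of such a test (class and method names made up) that avoids an explicit SVD by using the equivalent algebraic characterization 2 * E * E^T * E - trace(E * E^T) * E = 0, which holds exactly when a non-zero 3x3 matrix has singular values (s, s, 0):

```java
class EssentialCheck {
    // 3x3 matrix product c = a * b
    static double[][] mul(double[][] a, double[][] b) {
        double[][] c = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                for (int k = 0; k < 3; k++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }

    static double[][] transpose(double[][] a) {
        double[][] t = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                t[i][j] = a[j][i];
        return t;
    }

    // Max-abs entry of 2*E*E^T*E - trace(E*E^T)*E; this is ~0 exactly when E
    // has singular values (s, s, 0), i.e. when E is an essential matrix.
    static double essentialResidual(double[][] e) {
        double[][] eet = mul(e, transpose(e));
        double tr = eet[0][0] + eet[1][1] + eet[2][2];
        double[][] lhs = mul(eet, e); // (E*E^T)*E
        double max = 0;
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                max = Math.max(max, Math.abs(2 * lhs[i][j] - tr * e[i][j]));
        return max;
    }

    public static void main(String[] args) {
        // A genuine essential matrix: E = [t]_x * R with R = I, t = (1, 0, 0).
        double[][] good = {{0, 0, 0}, {0, 0, -1}, {0, 1, 0}};
        // The (diagonal) "Essential" matrix from the question is not one.
        double[][] bad = {{0.0541, 0, 0}, {0, 0.8273, 0}, {0, 0, 1}};
        System.out.println(essentialResidual(good)); // ~0
        System.out.println(essentialResidual(bad));  // clearly non-zero
    }
}
```

With a noisy estimated F the residual will of course not be exactly zero, so in practice you would compare it against a small threshold relative to the norm of F.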

Second, both the fundamental matrix and the essential matrix are defined for two cameras and do not make sense if you consider only one camera. If you do have two cameras, with respective matrices K1 and K2, then you can obtain the essential matrix E12, given the fundamental matrix F12 (which maps points in I1 to lines in I2), using the following formula (see equation 9.12 in Hartley&Zisserman's book):

E12 = K2^T . F12 . K1

In your case, you used K2 on both sides.

Upvotes: 0
