marsv

Reputation: 11

stereo rectification with measured extrinsic parameters

I am trying to rectify two sequences of images for stereo matching. The usual approach of using stereoCalibrate() with a checkerboard pattern is not an option for me, since I only have the recorded footage.

What I have is the correct calibration data of the individual cameras (camera matrix and distortion parameters), as well as measurements of the distance and angle between them.

How can I construct the rotation matrix and translation vector needed for stereoRectify()?

The naive approach of using

Mat T = (Mat_<double>(3,1) << distance, 0, 0);   // translation along the x-axis only (baseline)
Mat R = (Mat_<double>(3,3) << cos(angle), 0, sin(angle),
                              0,          1, 0,
                             -sin(angle), 0, cos(angle)); // rotation about the y-axis

resulted in a heavily warped image. Do these matrices need to relate to a different origin point I am not aware of? Or do I need to convert the distance/angle values somehow to depend on the pixel size?

Any help would be appreciated.

Upvotes: 1

Views: 1534

Answers (2)

YuZ

Reputation: 445

If intrinsics and extrinsics are known, I recommend this method: http://link.springer.com/article/10.1007/s001380050120#page-1

It is easy to implement. Basically, you rotate the right camera until both cameras have the same orientation, i.e. both share a common R. The epipoles are then mapped to infinity, and the epipolar lines become parallel to the image x-axis.

The first row of the new R (x) is simply the normalized baseline, i.e. the difference of the two camera centers. The second row (y) is the cross product of the old left z-axis with the baseline, and the third row (z) is the cross product of the first two rows.

Finally, you compute the 3x3 rectifying homography described in the link above and apply it with warpPerspective() to obtain the rectified images.

Upvotes: 0

sansuiso

Reputation: 9379

It's not clear whether you have enough information about the camera poses to perform an accurate rectification. Both T and R are measured in 3D, but in your case:

  • T is one-dimensional (along the x-axis only), which means that you are confident that the two cameras are perfectly aligned along the other axes (in particular, that you have a less-than-1-pixel error on the y-axis, i.e. a few microns by today's standards);
  • R leaves the y-coordinates untouched, so all you have is a rotation around this axis. Does that match your experimental setup?

Finally, check that the units you use for the translation and rotation are consistent with those of the intrinsic data (e.g., the translation in the same metric units as the calibration, and the angle in radians).

If it is feasible, you can check your results by finding some matching points between the two cameras and proceeding to a projective calibration: accurate knowledge of the 3D positions of the calibration points is required for metric reconstruction only. Other tasks rely on the essential or fundamental matrices, which can be computed from image-to-image point correspondences.

Upvotes: 1
