Reputation: 41
I am trying to distort an image back in OpenCV. First, the image captured with a pinhole camera is undistorted with cv::undistort(raw, undist, cameraMatrix, distCoeffs);
which works. Now I am trying to distort undist
back to its original state with the patch that I found here: http://code.opencv.org/issues/1387
but so far I have not managed to make it work. Here is the code:
void distort(const cv::Mat& src, cv::Mat& dst, const cv::Mat& cameraMatrix, const cv::Mat& distCoeffs)
{
    // One entry per pixel: the (x, y) location of that pixel in src
    cv::Mat pixel_locations_src = cv::Mat(src.size(), CV_32FC2);
    for (int i = 0; i < src.size().height; i++) {
        for (int j = 0; j < src.size().width; j++) {
            pixel_locations_src.at<cv::Point2f>(i,j) = cv::Point2f(j,i);
        }
    }

    // undistortPoints outputs normalized (fractional) coordinates,
    // so write its result into fractional_locations_dst
    cv::Mat fractional_locations_dst = cv::Mat(src.size(), CV_32FC2);
    cv::Mat pixel_locations_dst = cv::Mat(src.size(), CV_32FC2);
    cv::undistortPoints(pixel_locations_src, fractional_locations_dst, cameraMatrix, distCoeffs);

    const float fx = cameraMatrix.at<double>(0,0);
    const float fy = cameraMatrix.at<double>(1,1);
    const float cx = cameraMatrix.at<double>(0,2);
    const float cy = cameraMatrix.at<double>(1,2);

    // Re-project the normalized coordinates to pixel coordinates
    // (is there a faster way to do this?)
    for (int i = 0; i < fractional_locations_dst.size().height; i++) {
        for (int j = 0; j < fractional_locations_dst.size().width; j++) {
            const float x = fractional_locations_dst.at<cv::Point2f>(i,j).x*fx + cx;
            const float y = fractional_locations_dst.at<cv::Point2f>(i,j).y*fy + cy;
            pixel_locations_dst.at<cv::Point2f>(i,j) = cv::Point2f(x,y);
        }
    }

    cv::remap(src, dst, pixel_locations_dst, cv::Mat(), CV_INTER_LINEAR);
}
I tried to pass an RGB image to the function, but since undistortPoints
takes a 1xN, 2-channel matrix, the code fires an assertion inside undistortPoints.
I don't understand why distort()
should take a 1xN matrix as input.
Any light on the topic would be great. Thanks
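For reference, here is a plain-C++ sketch of the distortion model that cv::undistortPoints inverts (and that the remap above has to encode), applied to one normalized point. The coefficient ordering k1, k2, p1, p2, k3 assumes a 5-element distCoeffs; the function name and struct are illustrative only.

```cpp
#include <cassert>
#include <cmath>

struct Distorted { double x, y; };

// Apply the radial + tangential distortion model to a normalized
// point (x, y) = ((u - cx)/fx, (v - cy)/fy).
Distorted distortNormalized(double x, double y,
                            double k1, double k2, double p1, double p2, double k3)
{
    const double r2 = x * x + y * y;
    const double radial = 1.0 + k1 * r2 + k2 * r2 * r2 + k3 * r2 * r2 * r2;
    // Tangential terms model a lens that is not perfectly parallel to the sensor
    const double xd = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x);
    const double yd = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y;
    return {xd, yd};
}
```

With all coefficients zero this is the identity, and positive k1 pushes points radially outward, which is what the per-pixel map in distort() has to reproduce.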
Upvotes: 3
Views: 1235
I finally used a different approach: I only needed to distort back a certain set of points. Here is the code:
void DistortPoints(const std::vector<cv::Point2f>& src, std::vector<cv::Point2f>& dst, const cv::Mat& cameraMatrix, const cv::Mat& distorsionMatrix)
{
    const double fx = cameraMatrix.at<double>(0,0);
    const double fy = cameraMatrix.at<double>(1,1);
    const double cx = cameraMatrix.at<double>(0,2);
    const double cy = cameraMatrix.at<double>(1,2);

    // Normalize the points with the intrinsics; projectPoints will
    // re-apply the intrinsics together with the distortion
    std::vector<cv::Point3f> src2;
    src2.reserve(src.size());
    for (size_t i = 0; i < src.size(); i++)
        src2.push_back(cv::Point3f((src[i].x - cx) / fx, (src[i].y - cy) / fy, 0));

    cv::Mat rVec = cv::Mat::zeros(3, 1, CV_64F); // Rotation vector
    cv::Mat tVec = cv::Mat::zeros(3, 1, CV_64F); // Translation vector

    cv::projectPoints(src2, rVec, tVec, cameraMatrix, distorsionMatrix, dst);
}
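The same normalize → distort → re-project pipeline that DistortPoints delegates to cv::projectPoints can be checked without OpenCV with a plain-C++ sketch (assuming a 5-coefficient k1, k2, p1, p2, k3 model; the intrinsic values in the test are illustrative):

```cpp
#include <cassert>
#include <cmath>

struct Pixel { double u, v; };

// Map an undistorted pixel to its distorted location:
// normalize with the intrinsics, apply the distortion model,
// then re-project with the same intrinsics.
Pixel distortPixel(Pixel p,
                   double fx, double fy, double cx, double cy,
                   const double k[5])
{
    const double x = (p.u - cx) / fx;   // normalized coordinates
    const double y = (p.v - cy) / fy;
    const double r2 = x * x + y * y;
    const double radial = 1.0 + k[0] * r2 + k[1] * r2 * r2 + k[4] * r2 * r2 * r2;
    const double xd = x * radial + 2.0 * k[2] * x * y + k[3] * (r2 + 2.0 * x * x);
    const double yd = y * radial + k[2] * (r2 + 2.0 * y * y) + 2.0 * k[3] * x * y;
    return {fx * xd + cx, fy * yd + cy};  // back to pixel coordinates
}
```

With all coefficients zero every pixel maps to itself, which is a quick sanity check for the intrinsics handling.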
Upvotes: 1