River Liu

Reputation: 339

OpenCV Mat rotation gets wrong result

I want to rotate an image by 90 degrees. My code is as follows:

#include <opencv2/opencv.hpp>

using namespace cv;

int main(int argc, const char * argv[]) {
    Mat img = imread("/Users/chuanliu/Desktop/src4/p00.JPG");

    resize(img, img, Size(1024, 683));
    imwrite("/Users/chuanliu/Desktop/resize.jpg", img);
    Mat dst;
    Mat rot_mat = getRotationMatrix2D(Point(img.cols / 2.0, img.rows / 2.0), 90, 1);
    warpAffine(img, dst, rot_mat, Size(img.rows, img.cols));

    imwrite("/Users/chuanliu/Desktop/roatation.jpg", dst);

    return 0;
}

But the result is as follows:
Before rotation:

[input image]

After rotation:

[rotated output image]

It seems that something is wrong with the rotation center, but I don't think I set the wrong center. Can anyone tell me what is wrong?

Upvotes: 2

Views: 2017

Answers (2)

gavinb

Reputation: 20018

The centre is specified in terms of the source image's dimensions, Point(img.cols / 2.0, img.rows / 2.0), but you are not only rotating the image, you are also swapping the width and height in the output size when calling warpAffine:

Size(img.rows, img.cols)

so it looks like you might need to specify the centre in terms of the output image coordinates, e.g. Point(rows/2, cols/2).
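For reference, that change would look roughly like this (a sketch only, using the variables from the question; as the update below explains, this on its own does not fix the result):

    Mat rot_mat = getRotationMatrix2D(Point(img.rows / 2.0, img.cols / 2.0), 90, 1);
    warpAffine(img, dst, rot_mat, Size(img.rows, img.cols));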

Update:

Nope, that's not the solution. There is actually a very simple and efficient method for rotating an image by 90 degrees: using the cv::transpose() function:

#include <opencv2/opencv.hpp>

int main()
{
    cv::Mat img = cv::imread("5syfi.jpg");
    cv::Mat img_rotated;

    // transpose mirrors the image about its main diagonal;
    // combine with cv::flip() (see below) for a true 90-degree rotation
    cv::transpose(img, img_rotated);

    cv::imwrite("out.jpg", img_rotated);

    return 0;
}

Using a combination of cv::transpose() (to rotate) and cv::flip() (to mirror vertically and horizontally) you can very quickly perform rotations by 90, 180 and 270 degrees.
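For example, here is a minimal sketch of the three cases (my own illustration, not code from the original answer; the flip codes follow OpenCV's convention: 0 = flip around the x-axis, 1 = around the y-axis, -1 = both):

#include <opencv2/opencv.hpp>

// 90 degrees clockwise: transpose, then mirror around the y-axis
cv::Mat rotate90cw(const cv::Mat& src)
{
    cv::Mat dst;
    cv::transpose(src, dst);
    cv::flip(dst, dst, 1);
    return dst;
}

// 90 degrees counter-clockwise: transpose, then mirror around the x-axis
cv::Mat rotate90ccw(const cv::Mat& src)
{
    cv::Mat dst;
    cv::transpose(src, dst);
    cv::flip(dst, dst, 0);
    return dst;
}

// 180 degrees: mirror around both axes, no transpose needed
cv::Mat rotate180(const cv::Mat& src)
{
    cv::Mat dst;
    cv::flip(src, dst, -1);
    return dst;
}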

Using warpAffine() is much more flexible, but it is also much more expensive (i.e. slower) to compute. So if you only need to rotate by a multiple of 90°, use cv::transpose(). If you need to rotate by an arbitrary angle, use the warpAffine/warpPerspective functions. @Micka's answer gives a great example of how to do that.
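For the arbitrary-angle case, here is a rough warpAffine-only sketch (my own addition, not code from either answer): the usual trick is to enlarge the destination size to the rotated bounding box and shift the translation part of the 2x3 matrix accordingly:

#include <opencv2/opencv.hpp>

// Sketch: rotate by an arbitrary angle without cropping the corners
cv::Mat rotateByAngle(const cv::Mat& src, double angleDeg)
{
    cv::Point2f center(src.cols / 2.0f, src.rows / 2.0f);
    cv::Mat rot = cv::getRotationMatrix2D(center, angleDeg, 1.0);

    // bounding box of the image region after rotation
    cv::Rect bbox = cv::RotatedRect(center, src.size(), (float)angleDeg).boundingRect();

    // shift the transform so the rotated image lands inside the new canvas
    rot.at<double>(0, 2) += bbox.width  / 2.0 - center.x;
    rot.at<double>(1, 2) += bbox.height / 2.0 - center.y;

    cv::Mat dst;
    cv::warpAffine(src, dst, rot, bbox.size());
    return dst;
}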

Upvotes: 3

Micka

Reputation: 20130

Adapting my answer from:

OpenCV 2.4.3 - warpPerspective with reversed homography on a cropped image

you can use this code:

#include <opencv2/opencv.hpp>
#include <vector>

// forward declarations of the helper functions defined below
cv::Rect computeWarpedContourRegion(const std::vector<cv::Point> & points, const cv::Mat & homography);
cv::Mat adjustHomography(const cv::Rect & transformedRegion, const cv::Mat & homography);

int main(int argc, const char * argv[]) {
    cv::Mat img = cv::imread("../inputData/rotationInput.jpg");

    cv::imshow("input", img);

    cv::Mat dst;
    cv::Mat rot_mat = cv::getRotationMatrix2D(cv::Point(img.cols / 2.0, img.rows / 2.0), 90, 1);
    //cv::warpAffine(img, dst, rot_mat, cv::Size(img.rows, img.cols));

    // since I didn't write the code for affine transformations yet, we have to embed
    // the affine rotation matrix in a perspective transformation
    cv::Mat perspRotation = cv::Mat::eye(3, 3, CV_64FC1);
    for (int j = 0; j < rot_mat.rows; ++j)
        for (int i = 0; i < rot_mat.cols; ++i)
        {
            perspRotation.at<double>(j, i) = rot_mat.at<double>(j, i);
        }

    // image boundary corners:
    std::vector<cv::Point> imageCorners;
    imageCorners.push_back(cv::Point(0, 0));
    imageCorners.push_back(cv::Point(img.cols, 0));
    imageCorners.push_back(cv::Point(img.cols, img.rows));
    imageCorners.push_back(cv::Point(0, img.rows));

    // look at where the image will be placed after transformation:
    cv::Rect warpedImageRegion = computeWarpedContourRegion(imageCorners, perspRotation);

    // adjust the transformation so that the top-left corner of the transformed image
    // will be placed at the (0,0) coordinate
    cv::Mat adjustedTransformation = adjustHomography(warpedImageRegion, perspRotation);

    // finally warp the image
    cv::warpPerspective(img, dst, adjustedTransformation, warpedImageRegion.size());

    cv::imwrite("../outputData/rotationOutput.png", dst);
    cv::imshow("out", dst);
    cv::waitKey(0);

    return 0;
}

which uses these helper functions:

cv::Rect computeWarpedContourRegion(const std::vector<cv::Point> & points, const cv::Mat & homography)
{
    std::vector<cv::Point2f> transformed_points(points.size());

    for(unsigned int i=0; i<points.size(); ++i)
    {
        // warp the points
        transformed_points[i].x = points[i].x * homography.at<double>(0,0) + points[i].y * homography.at<double>(0,1) + homography.at<double>(0,2) ;
        transformed_points[i].y = points[i].x * homography.at<double>(1,0) + points[i].y * homography.at<double>(1,1) + homography.at<double>(1,2) ;
    }

    // dehomogenization necessary?
    if(homography.rows == 3)
    {
        float homog_comp;
        for(unsigned int i=0; i<transformed_points.size(); ++i)
        {
            homog_comp = points[i].x * homography.at<double>(2,0) + points[i].y * homography.at<double>(2,1) + homography.at<double>(2,2) ;
            transformed_points[i].x /= homog_comp;
            transformed_points[i].y /= homog_comp;
        }
    }

    // now find the bounding box for these points:
    cv::Rect boundingBox = cv::boundingRect(transformed_points);
    return boundingBox;
}


cv::Mat adjustHomography(const cv::Rect & transformedRegion, const cv::Mat & homography)
{
    if(homography.rows == 2) throw("homography adjustment for affine matrix not implemented yet");

    // unit matrix
    cv::Mat correctionHomography = cv::Mat::eye(3,3,CV_64F);
    // correction translation
    correctionHomography.at<double>(0,2) = -transformedRegion.x;
    correctionHomography.at<double>(1,2) = -transformedRegion.y;


    return correctionHomography * homography;
}

and produces this output for 90°:

[rotated output image, 90°]

and this output for 33°:

[rotated output image, 33°]

By the way, if you only want to rotate by 90°/180°, there might be much more efficient and more accurate (regarding interpolation) methods than image warping!

Upvotes: 2
