Gilad

Reputation: 6585

OpenCV: findContours exception

My MATLAB code is:

h = fspecial('average', filterSize);
imageData = imfilter(imageData, h, 'replicate');
bwImg = im2bw(imageData, grayThresh);

cDist=regionprops(bwImg, 'Area');
cDist=[cDist.Area];

My OpenCV code is:

cv::blur(dst, dst,cv::Size(filterSize,filterSize));
dst = im2bw(dst, grayThresh);

cv::vector<cv::vector<cv::Point> > contours;
cv::vector<cv::Vec4i> hierarchy;
cv::findContours(dst,contours,hierarchy,CV_RETR_CCOMP, CV_CHAIN_APPROX_NONE);

Here is my image-to-black-and-white function:

cv::Mat AutomaticMacbethDetection::im2bw(cv::Mat src, double grayThresh)
{
    cv::Mat dst;
    cv::threshold(src, dst, grayThresh, 1, CV_THRESH_BINARY);
    return dst; 
}

I'm getting an exception in findContours(): C++ exception: cv::Exception at memory location 0x0000003F6E09E0A0.

Can you please explain what I am doing wrong? dst is a cv::Mat that I have used all along, and it still holds my original values.

Update: here is my matrix written into a *.txt file: http://www.filedropper.com/gili

UPDATE 2: I have added dst.convertTo(dst, CV_8U); as Micka suggested, and I no longer get an exception. However, the values are nothing like what I expected.
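For reference, a minimal sketch of the thresholding step with the conversion from this update folded in (function and variable names follow the snippets above; using 255 as the foreground value is my own choice so the mask is easy to inspect, since findContours only distinguishes zero from non-zero):

// Sketch: binarize and hand back the 8-bit single-channel type that findContours expects.
cv::Mat im2bw(cv::Mat src, double grayThresh)
{
    cv::Mat dst;
    // 255 instead of 1 is purely for easier viewing; any non-zero value is foreground.
    cv::threshold(src, dst, grayThresh, 255, CV_THRESH_BINARY);
    dst.convertTo(dst, CV_8U); // the conversion added in UPDATE 2
    return dst;
}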

Upvotes: 1

Views: 2360

Answers (1)

rayryeng

Reputation: 104503

Take a look at this question which has a similar problem to what you're encountering: Matlab and OpenCV calculate different image moment m00 for the same image.

Basically, the OP in the linked post is trying to find the zeroth image moment of every closed contour (which is simply the area) using findContours in OpenCV and regionprops in MATLAB. In MATLAB that quantity is exposed through the Area property returned by regionprops, and judging from your MATLAB code, you wish to find the same thing.
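For example, once findContours has run, the zeroth moment of each contour can be read straight off the point lists (a minimal sketch reusing the contours variable from your question; note that cv::moments on a point list integrates the polygon outline, which is already a different quantity from the pixel count regionprops reports):

// Sketch: per-contour zeroth moments from the findContours output.
std::vector<double> areas;
for (size_t i = 0; i < contours.size(); ++i)
{
    cv::Moments m = cv::moments(contours[i]); // polygon moment, not a pixel count
    areas.push_back(m.m00);
}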

From the post, there is most certainly a difference between how OpenCV and MATLAB find contours in an image. This boils down to the way each platform decides what counts as a "connected pixel": OpenCV uses a four-pixel neighbourhood, while MATLAB uses an eight-pixel neighbourhood.

As such, there is nothing wrong with your implementation, and converting to 8UC1 is the right thing to do. However, the areas (and ultimately the total number of connected components and the contours themselves) found by MATLAB and by OpenCV will not be the same. The only way for you to get exactly the same result is to manually draw each contour found by findContours onto a black image and then run the cv::moments function directly on that image.
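A rough sketch of that idea, assuming dst is the CV_8UC1 binary image the contours were extracted from and reusing the contours variable from your question:

// Sketch: rasterize each contour onto a black canvas and take the moments
// of the filled region itself, rather than of the contour's point list.
std::vector<double> pixelAreas;
for (size_t i = 0; i < contours.size(); ++i)
{
    cv::Mat canvas = cv::Mat::zeros(dst.size(), CV_8UC1);
    cv::drawContours(canvas, contours, static_cast<int>(i), cv::Scalar(255), CV_FILLED);
    // binaryImage = true: every non-zero pixel counts as 1, so m00 is a pixel count.
    pixelAreas.push_back(cv::moments(canvas, true).m00);
}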

However, because cv::blur handles image borders differently from fspecial with an even-sized averaging mask, you still may not get identical results along the borders of the image. If there are no important contours near the borders of your image, then hopefully this will give you the right result.
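If the border behaviour matters to you, cv::blur does accept a border type, and passing BORDER_REPLICATE is closer to imfilter's 'replicate' option (a sketch; with an even filterSize the anchor position can still differ from MATLAB's, so an exact match is not guaranteed):

// Sketch: mean filter with replicated borders, mirroring imfilter(..., 'replicate').
cv::blur(dst, dst, cv::Size(filterSize, filterSize),
         cv::Point(-1, -1), cv::BORDER_REPLICATE);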


Good luck!

Upvotes: 2
