Reputation: 845
I would like to use cv2.pointPolygonTest to check whether a point lies inside or outside the detected contours, but I cannot figure out why it does not work.
This is the way I get contours:
import cv2
import numpy as np

img_name = "a295121c-f893-43f5-8d00-6bfddbc19658.jpg"
im = cv2.imread(img_name)
im_gray = cv2.cvtColor(im, cv2.COLOR_BGR2GRAY)  # imread returns BGR
ret, thresh = cv2.threshold(im_gray, 10, 255, cv2.THRESH_BINARY)
contours, hierarchy = cv2.findContours(thresh, cv2.RETR_LIST, cv2.CHAIN_APPROX_TC89_L1)
And this is how I check whether a point is inside or outside the detected object:
x1 = cv2.pointPolygonTest(contours[0], (x, y), False)
On this thresholded image it works fine: x1 is calculated correctly when (x, y) is inside the object, and np.shape(contours) is equal to (1, 241, 1, 2).
However, on this image all points are reported as outside, and np.shape(contours) is equal to (11,).
I suppose I am not using contours properly, but I cannot figure out which element of contours I should pass to cv2.pointPolygonTest().
Upvotes: 3
Views: 7232
Reputation: 1648
I wonder if your code to test the point is only this:
x1 = cv2.pointPolygonTest(contours[0], (x, y), False)
In this code, the function only tests the first contour. In the second image there are probably multiple contours, so please try to loop over all of them.
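A minimal sketch of such a loop, assuming contours comes from the cv2.findContours call in your question and (x, y) is the point you want to test:

inside_any = False
for i, cnt in enumerate(contours):
    # pointPolygonTest returns +1 (inside), -1 (outside) or 0 (on the edge) when measureDist=False
    result = cv2.pointPolygonTest(cnt, (x, y), False)
    if result >= 0:
        inside_any = True
        print("point (%d, %d) is inside or on contour %d" % (x, y, i))
print("inside at least one contour:", inside_any)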
Upvotes: 3