Alessandro

Reputation: 794

How can I remap a point after an image rotation?

I have a mathematical question: suppose I rotate an image around its center by an angle of 30°, using OpenCV with the following commands:

M = cv2.getRotationMatrix2D((cols/2,rows/2),30,1)
img_rotate = cv2.warpAffine(img,M,(cols,rows))

If I take the pixel (40,40) of img_rotate, how can I find the corresponding pixel in the original image?

EDIT: in other words, when I apply the rotation to an image I obtain the transformed image. Is there a way to obtain the mapping between points? For example, the (x,y) point of the new image corresponds to the (x',y') point of the original image.

Upvotes: 14

Views: 14326

Answers (2)

Alexey U.

Reputation: 356

Just use the matrix operations described in Affine Transformations together with the inverse matrix.

import cv2
import numpy as np

# the inverse of a pure rotation is the rotation by the opposite angle
# (use the same center that was passed to getRotationMatrix2D for the forward rotation)
M_inv = cv2.getRotationMatrix2D((100/2, 300/2), -30, 1)

# points in the rotated image
points = np.array([[35.,  0.],
                   [175., 0.],
                   [105., 200.],
                   [105., 215.],
                  ])

# append a column of ones to get homogeneous coordinates
ones = np.ones(shape=(len(points), 1))
points_ones = np.hstack([points, ones])

# apply the 2x3 affine matrix; each row is the point in the original image
transformed_points = M_inv.dot(points_ones.T).T
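
Applied to the question's pixel (40,40), a minimal sketch could look like the following (the image size cols x rows is an assumption here, purely for illustration; the center and angle must match the forward rotation):

import cv2
import numpy as np

cols, rows = 300, 200          # assumed image size, illustrative only

# forward rotation from the question
M = cv2.getRotationMatrix2D((cols/2, rows/2), 30, 1)

# inverse mapping: same center, opposite angle (scale 1)
M_inv = cv2.getRotationMatrix2D((cols/2, rows/2), -30, 1)

# pixel (40, 40) of img_rotate, in homogeneous coordinates
p = np.array([40., 40., 1.])

# corresponding (x', y') location in the original image
x_orig, y_orig = M_inv.dot(p)
print(x_orig, y_orig)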

Upvotes: 24

user2983637

Reputation: 142

You can use the transform() function to apply a given transformation to arrays of points.

cv2.transform(pointsToTransform, M)
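
A minimal sketch of how this could be used for the inverse mapping (note that cv2.transform expects the points as an N x 1 x 2 float array, and cv2.invertAffineTransform returns the inverse of a 2x3 affine matrix; the image size here is an assumption):

import cv2
import numpy as np

cols, rows = 300, 200                      # assumed image size
M = cv2.getRotationMatrix2D((cols/2, rows/2), 30, 1)

# invert the 2x3 affine matrix so rotated-image points map back to the original
M_inv = cv2.invertAffineTransform(M)

# points in the rotated image, shaped (N, 1, 2) as cv2.transform expects
pointsToTransform = np.array([[[40., 40.]], [[10., 20.]]], dtype=np.float32)

# corresponding points in the original image
originalPoints = cv2.transform(pointsToTransform, M_inv)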

Upvotes: 3
