Reputation: 441
I'm having trouble figuring out how to rotate a matrix of points in a 2D plane, since my coordinate system is not the standard mathematical one: the y-axis is inverted, meaning a higher y value is lower on the screen. I also want to rotate clockwise instead of the standard anticlockwise.
So if I try to illustrate how I want it to work:
O = origin, X = points to rotate
Then 0 degrees looks like this:
XXX
 O
I want 90 degrees to look like this:
 X
OX
 X
180 degrees should look like this:
 O
XXX
270 degrees should look like this:
X
XO
X
Any ideas on how to calculate the new x and y for a point after rotating in this plane?
Upvotes: 2
Views: 877
Reputation: 6834
Rotating clockwise rather than anti-clockwise just means flipping the sign of the angle.
To get the full result, we just transform into 'standard coords', do the rotation, and transform back:
The coordinate transform, which is also its own inverse, is:
(x') = ( 1 0 ) (x)
(y') ( 0 -1 ) (y)
A rotation anti-clockwise is:
(x') = ( cos(angle) -sin(angle) ) (x)
(y') ( sin(angle) cos(angle) ) (y)
So a rotation clockwise is:
(x') = ( cos(angle) sin(angle) ) (x)
(y') ( -sin(angle) cos(angle) ) (y)
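(If it helps to see that step on its own, here is a minimal C sketch of a clockwise rotation in ordinary maths coordinates. The helper name is made up for illustration, and it assumes <math.h> for cos/sin.)
#include <math.h>

// Clockwise rotation in standard maths coordinates (y pointing up).
// Hypothetical helper, only to illustrate this one step.
void rotate_clockwise_standard(double angle, double x, double y,
                               double *outX, double *outY)
{
    *outX =  cos(angle) * x + sin(angle) * y;
    *outY = -sin(angle) * x + cos(angle) * y;
}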
Altogether (reading the matrices right to left: flip into standard coords, rotate clockwise, flip back) this gives:
(x') = ( 1 0 )( cos(angle) sin(angle) ) ( 1 0 )(x)
(y') ( 0 -1 )( -sin(angle) cos(angle) ) ( 0 -1 )(y)
Multiply the matrices to get:
(x') = ( cos(angle) -sin(angle) ) (x)
(y') ( sin(angle) cos(angle) ) (y)
Now, as you may by now have realized, this is actually the same matrix as rotating 'standard coords' in the anti-clockwise direction.
Or in code:
// Angle in radians
double x2 = cos(angle) * x1 - sin(angle) * y1;
double y2 = sin(angle) * x1 + cos(angle) * y1;
For example, if angle is 180 degrees, cos(angle) is -1 and sin(angle) is 0, giving:
double x2 = -x1;
double y2 = -y1;
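And to check against the pictures in the question, here is a small self-contained C sketch (the function name, the point list and the printing are just for illustration) that applies the formula above to the three X's from the 0-degree picture with a 90-degree angle:
#include <math.h>
#include <stdio.h>

// The formula from above, in the question's screen coordinates
// (y grows downwards). Coordinates are relative to the origin O.
// Hypothetical helper name, just for this example.
void rotate_screen(double angle, double x1, double y1, double *x2, double *y2)
{
    *x2 = cos(angle) * x1 - sin(angle) * y1;
    *y2 = sin(angle) * x1 + cos(angle) * y1;
}

int main(void)
{
    // The three X's from the 0-degree picture: one row above O.
    double pts[3][2] = { {-1.0, -1.0}, {0.0, -1.0}, {1.0, -1.0} };
    double angle = 90.0 * 3.14159265358979323846 / 180.0;  // 90 degrees in radians

    for (int i = 0; i < 3; i++) {
        double x2, y2;
        rotate_screen(angle, pts[i][0], pts[i][1], &x2, &y2);
        // Round away tiny floating-point error before printing.
        printf("(%ld, %ld)\n", lround(x2), lround(y2));
    }
    return 0;
}
That prints (1, -1), (1, 0) and (1, 1): a column of X's one step to the right of O, which is the 90-degree picture from the question.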
Upvotes: 2