Reputation: 117771
In a programming language (Python, C#, etc.) I need to determine how to calculate the angle between a line and the horizontal axis.
I think an image describes best what I want:
Given (P1x, P1y) and (P2x, P2y), what is the best way to calculate this angle? The origin is in the top left and only the positive quadrant is used.
Upvotes: 264
Views: 249718
Reputation: 136665
import math
from collections import namedtuple

Point = namedtuple("Point", ["x", "y"])


def get_angle(p1: Point, p2: Point) -> float:
    """Get the angle of this line with the horizontal axis."""
    dx = p2.x - p1.x
    dy = p2.y - p1.y
    theta = math.atan2(dy, dx)
    angle = math.degrees(theta)  # angle is in (-180, 180]
    if angle < 0:
        angle = 360 + angle
    return angle
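A quick usage check (the sample points and expected values are mine, not part of the original answer):

print(get_angle(Point(0, 0), Point(1, 1)))   # 45.0
print(get_angle(Point(0, 0), Point(-1, 0)))  # 180.0
print(get_angle(Point(0, 0), Point(0, -1)))  # 270.0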
For testing, I let Hypothesis generate the test cases.
import hypothesis.strategies as s
from hypothesis import given


@given(s.floats(min_value=0.0, max_value=360.0))
def test_angle(angle: float):
    epsilon = 0.0001
    x = math.cos(math.radians(angle))
    y = math.sin(math.radians(angle))
    p1 = Point(0, 0)
    p2 = Point(x, y)
    assert abs(get_angle(p1, p2) - angle) < epsilon
Upvotes: 4
Reputation: 409
double deltaY = Math.Abs(P2.y - P1.y);
double deltaX = Math.Abs(P2.x - P1.x);
double angleInDegrees = Math.Atan2(deltaY, deltaX) * 180 / Math.PI;

if (P2.y > P1.y) // Second point is lower on screen than the first, angle goes down (180-360)
{
    if (P2.x < P1.x) // Second point is to the left of the first (180-270)
        angleInDegrees = 180 + angleInDegrees;
    else             // Second point is to the right of the first (270-360)
        angleInDegrees = 360 - angleInDegrees;
}
else if (P2.x < P1.x) // Second point is above and to the left of the first (90-180)
    angleInDegrees = 180 - angleInDegrees;
Upvotes: 0
Reputation: 77
A formula for an angle from 0 to 2*pi.
With x = x2 - x1 and y = y2 - y1, the formula works for any values of x and y. For x = y = 0 the result is undefined.

f(x,y) = pi() - pi()/2*(1+sign(x))*(1-sign(y^2))
         - pi()/4*(2+sign(x))*sign(y)
         - sign(x*y)*atan((abs(x)-abs(y))/(abs(x)+abs(y)))
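For illustration, a minimal Python transcription of the same formula (my own sketch, not part of the original answer; sign() is a small helper since Python's math module has none):

import math

def sign(v):
    # -1, 0, or 1 depending on the sign of v
    return (v > 0) - (v < 0)

def angle_0_to_2pi(x, y):
    # Direct transcription of the formula above; raises for x == y == 0 (undefined).
    return (math.pi
            - math.pi / 2 * (1 + sign(x)) * (1 - sign(y ** 2))
            - math.pi / 4 * (2 + sign(x)) * sign(y)
            - sign(x * y) * math.atan((abs(x) - abs(y)) / (abs(x) + abs(y))))

print(angle_0_to_2pi(1, 1))    # ~0.785 (pi/4)
print(angle_0_to_2pi(-1, -1))  # ~3.927 (5*pi/4)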
Upvotes: 0
Reputation: 2332
MATLAB function:

function [lineAngle] = getLineAngle(x1, y1, x2, y2)
    deltaY = y2 - y1;
    deltaX = x2 - x1;

    lineAngle = rad2deg(atan2(deltaY, deltaX));

    if deltaY < 0
        lineAngle = lineAngle + 360;
    end
end
Upvotes: 0
Reputation: 32898
First find the difference between the start point and the end point (here, this is more of a directed line segment, not a "line", since lines extend infinitely and don't start at a particular point).
deltaY = P2_y - P1_y
deltaX = P2_x - P1_x
Then calculate the angle (which runs from the positive X axis at P1 to the positive Y axis at P1).
angleInDegrees = arctan(deltaY / deltaX) * 180 / PI
But arctan may not be ideal, because dividing the differences this way loses the information needed to determine which quadrant the angle is in (see below). Use the following instead if your language includes an atan2 function:
angleInDegrees = atan2(deltaY, deltaX) * 180 / PI
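For illustration (my own example, not part of the original answer), here is the quadrant problem in Python for a point left of and below P1:

import math

deltaX, deltaY = -1.0, -1.0  # P2 is to the left of and below P1 (third quadrant)

print(math.degrees(math.atan(deltaY / deltaX)))   # 45.0   -- the quadrant information is lost
print(math.degrees(math.atan2(deltaY, deltaX)))   # -135.0 -- correct (equivalently 225 degrees)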
EDIT (Feb. 22, 2017): In general, however, calling atan2(deltaY, deltaX) just to get the proper angle for cos and sin may be inelegant. In those cases, you can often do the following instead:

1. Treat (deltaX, deltaY) as a vector.
2. Normalize that vector: divide deltaX and deltaY by the vector's length (sqrt(deltaX*deltaX + deltaY*deltaY)), unless the length is 0.
3. deltaX will now be the cosine of the angle between the vector and the horizontal axis (in the direction from the positive X to the positive Y axis at P1).
4. deltaY will now be the sine of that angle.

EDIT (Feb. 28, 2017): Even without normalizing (deltaX, deltaY):

- The sign of deltaX will tell you whether the cosine described in step 3 is positive or negative.
- The sign of deltaY will tell you whether the sine described in step 4 is positive or negative.
- The signs of deltaX and deltaY will tell you which quadrant the angle is in, in relation to the positive X axis at P1:
  - +deltaX, +deltaY: 0 to 90 degrees.
  - -deltaX, +deltaY: 90 to 180 degrees.
  - -deltaX, -deltaY: 180 to 270 degrees (-180 to -90 degrees).
  - +deltaX, -deltaY: 270 to 360 degrees (-90 to 0 degrees).

An implementation in Python using radians (provided on July 19, 2015 by Eric Leschinski, who edited my answer):
from math import *

def angle_trunc(a):
    while a < 0.0:
        a += pi * 2
    return a

def getAngleBetweenPoints(x_orig, y_orig, x_landmark, y_landmark):
    deltaY = y_landmark - y_orig
    deltaX = x_landmark - x_orig
    return angle_trunc(atan2(deltaY, deltaX))

angle = getAngleBetweenPoints(5, 2, 1, 4)
assert angle >= 0, "angle must be >= 0"
angle = getAngleBetweenPoints(1, 1, 2, 1)
assert angle == 0, "expecting angle to be 0"
angle = getAngleBetweenPoints(2, 1, 1, 1)
assert abs(pi - angle) <= 0.01, "expecting angle to be pi, it is: " + str(angle)
angle = getAngleBetweenPoints(2, 1, 2, 3)
assert abs(angle - pi/2) <= 0.01, "expecting angle to be pi/2, it is: " + str(angle)
angle = getAngleBetweenPoints(2, 1, 2, 0)
assert abs(angle - (pi + pi/2)) <= 0.01, "expecting angle to be pi+pi/2, it is: " + str(angle)
angle = getAngleBetweenPoints(1, 1, 2, 2)
assert abs(angle - (pi/4)) <= 0.01, "expecting angle to be pi/4, it is: " + str(angle)
angle = getAngleBetweenPoints(-1, -1, -2, -2)
assert abs(angle - (pi + pi/4)) <= 0.01, "expecting angle to be pi+pi/4, it is: " + str(angle)
angle = getAngleBetweenPoints(-1, -1, -1, 2)
assert abs(angle - (pi/2)) <= 0.01, "expecting angle to be pi/2, it is: " + str(angle)
All tests pass. See https://en.wikipedia.org/wiki/Unit_circle
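For the Feb. 22 edit above, a minimal sketch (mine, not from the original answer) that yields the cosine and sine of the angle by normalizing, without calling atan2:

import math

def direction_cos_sin(p1x, p1y, p2x, p2y):
    # Returns (cos, sin) of the angle between the vector P1->P2 and the horizontal axis.
    deltaX = p2x - p1x
    deltaY = p2y - p1y
    length = math.sqrt(deltaX * deltaX + deltaY * deltaY)
    if length == 0:
        raise ValueError("P1 and P2 coincide; the direction is undefined")
    return deltaX / length, deltaY / length

cos_a, sin_a = direction_cos_sin(0, 0, 1, 1)
print(cos_a, sin_a)  # both ~0.7071, i.e. a 45-degree direction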
Upvotes: 400
Reputation: 1957
Considering the exact question, which puts us in a "special" coordinate system where the positive Y axis means moving DOWN (as on a screen or an interface view), you need to adapt this function and negate the Y coordinates:
Example in Swift 2.0
func angle_between_two_points(pa: CGPoint, pb: CGPoint) -> Double {
    let deltaY: Double = Double(-pb.y) - Double(-pa.y)
    let deltaX: Double = Double(pb.x) - Double(pa.x)
    var a = atan2(deltaY, deltaX)
    while a < 0.0 {
        a = a + M_PI * 2
    }
    return a
}
This function gives a correct answer to the question. The answer is in radians, so to view the angle in degrees, the usage is:
let p1 = CGPoint(x: 1.5, y: 2) //estimated coords of p1 in question
let p2 = CGPoint(x: 2, y : 3) //estimated coords of p2 in question
print(angle_between_two_points(p1, pb: p2) / (M_PI/180))
//returns 296.56
Upvotes: 1
Reputation: 5392
Based on Peter O.'s answer, here is the Java version:

private static final float angleBetweenPoints(PointF a, PointF b) {
    float deltaY = b.y - a.y;
    float deltaX = b.x - a.x;
    return (float) Math.atan2(deltaY, deltaX);
}
Upvotes: 1
Reputation:
I have found a solution in Python that works well!

from math import atan2, degrees

def GetAngleOfLineBetweenTwoPoints(p1, p2):
    return degrees(atan2(p2[1] - p1[1], p2[0] - p1[0]))

print(GetAngleOfLineBetweenTwoPoints((0, 1), (1, 3)))  # ~63.43
Upvotes: 1
Reputation: 545
Sorry, but I'm pretty sure Peter's answer is wrong. Note that the y axis goes down the page (common in graphics). As such the deltaY calculation has to be reversed, or you get the wrong answer.
Consider:
System.out.println (Math.toDegrees(Math.atan2(1,1)));
System.out.println (Math.toDegrees(Math.atan2(-1,1)));
System.out.println (Math.toDegrees(Math.atan2(1,-1)));
System.out.println (Math.toDegrees(Math.atan2(-1,-1)));
gives
45.0
-45.0
135.0
-135.0
So if, in the example above, P1 is (1,1) and P2 is (2,2) [because Y increases down the page], the code above will give 45.0 degrees, which is wrong for the picture shown. Change the order of the deltaY calculation and it works properly.
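For comparison (a minimal Python sketch of the fix described here; the function name is my own), reversing the deltaY term gives the on-screen angle directly:

import math

def screen_angle_degrees(p1, p2):
    # p1 and p2 are (x, y) tuples in screen coordinates, where y grows downward.
    deltaY = p1[1] - p2[1]  # reversed, to undo the downward-pointing Y axis
    deltaX = p2[0] - p1[0]
    return math.degrees(math.atan2(deltaY, deltaX)) % 360

print(screen_angle_degrees((1, 1), (2, 2)))  # 315.0, not 45.0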
Upvotes: 52