Reputation: 1
Curves a and b are two great-circle segments lying on a sphere. How can I decide whether these two segments intersect each other?
Upvotes: 0
Views: 64
Reputation:
Assume you know the angle between the planes containing the circles as well as their distances to the center of the sphere (equivalently, the circles' radii). The problem can then be solved in 2D: make one of the planes horizontal and align the intersection line of the two planes with the viewing direction. What you see is an angle formed by two lines (the traces of the planes), one of them horizontal. You have all the information needed to establish the equations of the lines and find their intersection point. Check whether that point lies within the apparent outline of the sphere.
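A minimal Python sketch of this cross-section test, assuming we already know the sphere radius R, the distances d_a and d_b of the two planes from the center, and the dihedral angle alpha between the planes (the function name and parameters are illustrative, and the test covers the full circles, not the clipped arcs):

```python
import math

def circles_intersect(R, d_a, d_b, alpha):
    """Do the two circles cut on a sphere of radius R by planes at distances
    d_a and d_b from the center, meeting at dihedral angle alpha (radians),
    intersect?

    Work in the cross-section perpendicular to the planes' intersection line:
    plane a appears as the horizontal line y = d_a, plane b as the line with
    unit normal (sin alpha, cos alpha) at distance d_b from the origin.
    Their intersection point is the trace of the plane-plane intersection
    line; the circles meet iff that point falls inside the sphere's apparent
    outline (a circle of radius R).
    """
    if math.isclose(math.sin(alpha), 0.0, abs_tol=1e-12):
        # Parallel planes: the circles can only meet if the planes coincide.
        return math.isclose(d_a, d_b)
    y = d_a
    x = (d_b - d_a * math.cos(alpha)) / math.sin(alpha)
    return math.hypot(x, y) <= R
```

This decides the question for the full circles only; for the original question's segments you would still have to check that the resulting intersection points lie inside both arcs.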
Upvotes: 1
Reputation: 80327
I assume you only need to know whether the paths intersect, not the intersection point itself (in the latter case, just look at the "Intersection of two paths given start points and bearings" section of the link below).
Suppose we have points a1, a2, b1, b2 in lat/lon coordinates. The paths a1a2 and b1b2 intersect if b1 and b2 lie in different hemispheres relative to the great circle through a1a2, and a1 and a2 lie in different hemispheres relative to the great circle through b1b2.
To determine the hemispheres, we can calculate the signs of the cross-track distances, as described in the "Cross-track distance" section of that page:
Sign(pt1, pt2, pt3) = sign(δ13) ⋅ sign(θ13 − θ12)
where δ13 is the (angular) distance from the start point to the third point,
θ13 is the (initial) bearing from the start point to the third point,
θ12 is the (initial) bearing from the start point to the end point.
So check that
Sign(a1, a2, b1) != Sign(a1, a2, b2)
Sign(b1, b2, a1) != Sign(b1, b2, a2)
Angular distance (Haversine formula):
a = sin²(Δφ/2) + cos φ1 ⋅ cos φ2 ⋅ sin²(Δλ/2)
c = 2 ⋅ atan2( √a, √(1−a) )
where φ is latitude, λ is longitude, and c is the angular distance in radians
Bearing:
θ = atan2( sin Δλ ⋅ cos φ2 , cos φ1 ⋅ sin φ2 − sin φ1 ⋅ cos φ2 ⋅ cos Δλ )
where φ1,λ1 is the start point,
φ2,λ2 the end point
(Δλ is the difference in longitude)
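Putting the pieces together, here is a rough Python sketch of the whole test (all names are illustrative; inputs are (lat, lon) pairs in radians, so convert with math.radians first). To avoid wrap-around problems when comparing bearings, the sign is taken from the cross-track distance formula itself, i.e. from sin(δ13) ⋅ sin(θ13 − θ12):

```python
import math

def angular_distance(p1, p2):
    """Angular distance between two (lat, lon) points (haversine formula)."""
    (phi1, lam1), (phi2, lam2) = p1, p2
    a = (math.sin((phi2 - phi1) / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin((lam2 - lam1) / 2) ** 2)
    return 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

def initial_bearing(p1, p2):
    """Initial bearing from p1 towards p2."""
    (phi1, lam1), (phi2, lam2) = p1, p2
    dlam = lam2 - lam1
    return math.atan2(math.sin(dlam) * math.cos(phi2),
                      math.cos(phi1) * math.sin(phi2)
                      - math.sin(phi1) * math.cos(phi2) * math.cos(dlam))

def side(p1, p2, p3):
    """Which hemisphere of the great circle through p1 -> p2 contains p3
    (sign of the cross-track distance)."""
    d13 = angular_distance(p1, p3)
    t13 = initial_bearing(p1, p3)
    t12 = initial_bearing(p1, p2)
    return math.copysign(1.0, math.sin(d13) * math.sin(t13 - t12))

def segments_intersect(a1, a2, b1, b2):
    """True if the great-circle segments a1-a2 and b1-b2 cross each other."""
    return (side(a1, a2, b1) != side(a1, a2, b2)
            and side(b1, b2, a1) != side(b1, b2, a2))
```

For example, segments_intersect((0, 0), (0, math.radians(10)), (math.radians(5), math.radians(5)), (math.radians(-5), math.radians(5))) returns True: the equator segment 0°E–10°E and the meridian segment 5°N–5°S at longitude 5°E cross at (0°, 5°E).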
Upvotes: 2