I-love-python

Reputation: 75

How to calculate the distance between two points on lines in Python

I have two lines, namely (x1, y1) and (x2, y2). I need to calculate the distance between the corresponding points. See my code snippet below:

import numpy as np
import plotly.graph_objects as go

x1 = np.array([525468.80914272, 525468.70536016])
y1 = np.array([175517.80433391, 175517.75493122])

x2 = np.array([525468.81174, 525468.71252])
y2 = np.array([175517.796305, 175517.74884])

Here is the code for the plot:

fig = go.Figure()

fig.add_trace(go.Scatter(x=x1, y=y1, name="point1"))
fig.add_trace(go.Scatter(x=x2, y=y2, name="point2"))

[Figure: plot of point1 and point2, with a black line joining each corresponding pair of points.]

The black line is the distance I want to calculate.
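For completeness, here is a minimal sketch of how those black segments could be drawn, assuming the arrays and fig defined above (the loop and styling are illustrative, not code from the question):

# draw one black segment per pair of corresponding points
for i in range(len(x1)):
    fig.add_trace(go.Scatter(
        x=[x1[i], x2[i]], y=[y1[i], y2[i]],
        mode="lines", line=dict(color="black"),
        showlegend=False))
fig.show()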

My expected results are: (0.008438554274975979, 0.0085878435595034274819)

Upvotes: 6

Views: 1558

Answers (4)

Kien Nguyen

Reputation: 2701

You can use the distance formula based on the Pythagorean theorem: AB = √((xA - xB)² + (yA - yB)²), where (xA, yA) and (xB, yB) are the coordinates of the two points A and B.

Applied to your problem:

import math

distance_1 = math.sqrt((x1[0] - x2[0]) ** 2 + (y1[0] - y2[0]) ** 2)
distance_2 = math.sqrt((x1[1] - x2[1]) ** 2 + (y1[1] - y2[1]) ** 2)

print(distance_1, distance_2)

Output:

0.008438557910490769 0.009400333483144686

Upvotes: 0

l.b.vasoya

Reputation: 1221

You can calculate the distance using the Pythagorean theorem. I have outlined two ways below.


  • You can get a with a = x1 - x2 and b = y1 - y2; the distance c then comes from the formula below. Since x1, y1, x2, y2 are numpy arrays here, np.sqrt is used rather than math.sqrt, and abs is redundant because a square root is never negative:

 import numpy as np
 a = x1 - x2  # element-wise differences in x
 b = y1 - y2  # element-wise differences in y

 c = np.sqrt(a ** 2 + b ** 2)  # element-wise distances

Here, in your case:

import math
a1 = x1[0] - x2[0]
a2 = x1[1] - x2[1]
b1 = y1[0] - y2[0]
b2 = y1[1] - y2[1]

distance_a = math.sqrt(a1 ** 2 + b1 ** 2)
distance_b = math.sqrt(a2 ** 2 + b2 ** 2)
  • Alternatively, you can use math.hypot directly, which gives the same answer:

import math
a1 = x1[0] - x2[0]
a2 = x1[1] - x2[1]
b1 = y1[0] - y2[0]
b2 = y1[1] - y2[1]

distance_a = math.hypot(a1, b1)  # equivalent to sqrt(a1**2 + b1**2)
distance_b = math.hypot(a2, b2)

You can use any of these approaches; all of them give you the same distances.
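As a side note, numpy's np.hypot broadcasts over whole arrays, so both distances can be computed in one call (a minimal vectorized sketch assuming the arrays from the question; not part of the original answer):

import numpy as np

# np.hypot applies element-wise to the difference arrays
distances = np.hypot(x1 - x2, y1 - y2)
print(distances)  # [0.00843856 0.00940033]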

Upvotes: 1

joostblack

Reputation: 2535

The distance here is just the L2 (Euclidean) norm. You can use numpy for this.

import numpy as np

distance_1 = np.linalg.norm(np.array([x1[0] - x2[0], y1[0] - y2[0]]))
distance_2 = np.linalg.norm(np.array([x1[1] - x2[1], y1[1] - y2[1]]))

print(distance_1, distance_2)

Output:

0.008438557910490769 0.009400333483144686

The default norm that np.linalg.norm uses is the Euclidean norm (the distance the black lines represent).
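np.linalg.norm can also handle both pairs in one call if the differences are stacked and a norm axis is given (a sketch assuming the arrays from the question; not part of the original answer):

import numpy as np

# stack the x and y differences into a (2, 2) array and take the
# Euclidean norm over the coordinate axis
diffs = np.stack([x1 - x2, y1 - y2])
distances = np.linalg.norm(diffs, axis=0)
print(distances)  # [0.00843856 0.00940033]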

Upvotes: 2

frux09

Reputation: 66

You can solve this with the math library:

import math

distancePointA = math.sqrt((x1[0] - x2[0]) ** 2 + (y1[0] - y2[0]) ** 2)
distancePointB = math.sqrt((x1[1] - x2[1]) ** 2 + (y1[1] - y2[1]) ** 2)
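On Python 3.8+, math.dist expresses the same computation directly (a small sketch under that version assumption; not part of the original answer):

import math

# math.dist takes two point-like sequences (Python 3.8+)
distancePointA = math.dist((x1[0], y1[0]), (x2[0], y2[0]))
distancePointB = math.dist((x1[1], y1[1]), (x2[1], y2[1]))
print(distancePointA, distancePointB)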

Upvotes: 3
