NadavRub

Reputation: 2610

Find the point minimizing the distance from a set of N lines

Given Multiple (N) lines in 3d space, find the point minimizing the distance to all lines.

  1. Given that the shortest distance between a line [aX + b] and a point [P] lies along the perpendicular from [P] to the line, I can express the total squared distance as the sum of the squared per-line distances, e.g. ([aX+b]1 – [P])^2 + … + ([aX+b]n – [P])^2.
  2. Since the segment from [P] to each line is perpendicular to that line, I can use the dot product to express [P] in terms of the line.

I have considered using least squares to estimate the point minimizing the distance. The problem is that standard least squares approximates the best-fitting line/curve for a given set of points; what I need is the opposite: given a set of lines, estimate the best-fitting point.

How should this be approached?

Upvotes: 4

Views: 4035

Answers (5)

Luke Hutchison

Reputation: 9230

You can apply the following answer (which talks about finding the point that is closest to a set of planes) to this problem, since just as a plane can be defined by a point on the plane and a normal to the plane, a line can be defined by a point the line passes through and a "normal" vector orthogonal to the line:

https://math.stackexchange.com/a/3483313/365886

You can solve the resulting quadratic form by observing that the minimizer of 1/2 x^T A x - b^T x + c is x_min = A^{-1} b.
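
As a concrete sketch of that quadratic-form solve (Python/NumPy, with illustrative names), assuming each line is given by a point it passes through and a direction vector: the squared distance from p to line i is (p - a_i)'(I - d_i d_i')(p - a_i), so summing over the lines gives A and b directly.

    import numpy as np

    def closest_point_to_lines(points, directions):
        """Point minimizing the sum of squared distances to a set of 3D lines.

        Line i passes through points[i] and has (non-zero) direction directions[i].
        Accumulates A = sum(I - d d') and b = sum((I - d d') a), then x_min = A^{-1} b.
        """
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for a, d in zip(points, directions):
            d = np.asarray(d, float) / np.linalg.norm(d)   # unit direction
            M = np.eye(3) - np.outer(d, d)                 # projector onto the plane normal to the line
            A += M
            b += M @ np.asarray(a, float)
        return np.linalg.solve(A, b)                       # note: A is singular if all lines are parallel

    # Example: the x-axis and the line through (0, 0, 2) along y -> closest point (0, 0, 1)
    print(closest_point_to_lines([(0, 0, 0), (0, 0, 2)], [(1, 0, 0), (0, 1, 0)]))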

Upvotes: 0

Vikram Bhat

Reputation: 6246

Following is a solution using calculus:

F(x,y) = sum((y - mi*x - ci)^2 / (1 + mi^2))

Using partial differentiation:

dF(x,y)/dx = sum(-2*mi*(y - mi*x - ci)/(1 + mi^2))

dF(x,y)/dy = sum(2*(y - mi*x - ci)/(1 + mi^2))

To minimize F(x,y):

dF(x,y)/dy = dF(x,y)/dx = 0

Use gradient descent with a suitable learning rate (and random restarts if needed) to find the minimum.
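
A minimal gradient-descent sketch of the above (Python/NumPy; the function name, starting point, and learning rate are illustrative), assuming 2D lines y = mi*x + ci as in the formulas:

    import numpy as np

    def closest_point_gd(m, c, lr=0.01, steps=20000):
        """Minimize F(x, y) = sum((y - m_i*x - c_i)^2 / (1 + m_i^2)) by gradient descent."""
        m = np.asarray(m, float)
        c = np.asarray(c, float)
        w = 1.0 / (1.0 + m**2)                   # per-line weight 1 / (1 + m_i^2)
        x, y = 0.0, 0.0                          # starting point (randomize for restarts)
        for _ in range(steps):
            r = y - m * x - c                    # residual of each line
            x -= lr * np.sum(-2.0 * m * r * w)   # dF/dx
            y -= lr * np.sum(2.0 * r * w)        # dF/dy
        return x, y

    # Example: lines y = x and y = -x + 2 intersect at (1, 1)
    print(closest_point_gd([1.0, -1.0], [0.0, 2.0]))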

Upvotes: 0

josliber

Reputation: 44340

From Wikipedia, we read that the squared distance between the line a'x + b = 0 and a point p is (a'p + b)^2 / (a'a). Finding the point that minimizes the sum of squared distances is therefore a weighted linear regression problem with one observation for each line. The regression model has the following properties:

  • Sample data a for each line a'x + b = 0
  • Sample outcome -b for each line a'x + b = 0
  • Sample weight 1/(a'a) for each line a'x + b = 0

You should be able to solve this problem with any standard statistical software.
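
A minimal sketch of that regression setup (Python/NumPy, illustrative names), assuming each line is supplied in implicit form a'x + b = 0 and folding the weights in by square-root scaling so an ordinary least-squares solver can be used:

    import numpy as np

    def closest_point_wls(A, b):
        """Weighted least squares: minimize sum_i (a_i'p + b_i)^2 / (a_i'a_i).

        A is an (N, d) array whose rows are the a_i; b is an (N,) array of offsets.
        Observation i: data a_i, outcome -b_i, weight 1/(a_i'a_i).
        """
        A = np.asarray(A, float)
        b = np.asarray(b, float)
        w = 1.0 / np.sum(A**2, axis=1)      # weights 1 / (a_i' a_i)
        sw = np.sqrt(w)                     # sqrt-weights applied to both sides
        p, *_ = np.linalg.lstsq(A * sw[:, None], -b * sw, rcond=None)
        return p

    # Example (2D): lines x = 0 and y - 2 = 0 -> closest point (0, 2)
    print(closest_point_wls([[1, 0], [0, 1]], [0, -2]))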

Upvotes: 3

Ivaylo Strandjev

Reputation: 71009

I have solved the same problem using hill climbing. Consider a single point and its 26 neighbours one step away from it (the points of a cube centred at the current point). If the distance from the current point is better than the distance from all of its neighbours, divide the step by 2; otherwise make the neighbour with the best distance the new current point. Continue until the step is small enough.
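
A minimal sketch of that hill-climbing loop (Python/NumPy; total_distance is an assumed user-supplied callable returning the combined distance from a point to all lines):

    import numpy as np
    from itertools import product

    def hill_climb(total_distance, start, step=1.0, tol=1e-9):
        """Hill climbing over the 26 neighbours on a cube centred at the current point."""
        p = np.asarray(start, float)
        offsets = [np.array(o, float) for o in product((-1, 0, 1), repeat=3) if any(o)]  # 26 directions
        best = total_distance(p)
        while step > tol:
            candidates = [(total_distance(p + step * o), p + step * o) for o in offsets]
            d, q = min(candidates, key=lambda t: t[0])
            if d < best:
                best, p = d, q      # best neighbour becomes the new current point
            else:
                step /= 2.0         # current point beats all neighbours: halve the step
        return p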

Upvotes: 0

DrV

Reputation: 23550

An approach:

  • form the equations giving the distance from the point to each line
  • these equations give you N distances
  • optimize the set of distances by the criterion you want (least squares, minimax, etc.)

This reduces to a simple optimization problem once you have the N equations. Of course, the difficulty of the last step depends heavily on the criterion you choose (least squares is simple, minimax not so simple).

One thing that might help you move forward is to find the simplest form of the equation giving the distance from a point to a line. Your thinking in #1 is correct, but you will need to think a bit further (or check "distance from a point to line" with any search engine).
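
For reference, the distance from a point P to the 3D line through Q with direction u is |(P - Q) x u| / |u|; a minimal sketch (Python/NumPy, illustrative name):

    import numpy as np

    def point_line_distance(p, q, u):
        """Distance from point p to the line through q with direction u (3D)."""
        p, q, u = (np.asarray(v, float) for v in (p, q, u))
        return np.linalg.norm(np.cross(p - q, u)) / np.linalg.norm(u)

    # Example: the distance from (0, 0, 1) to the x-axis is 1
    print(point_line_distance((0, 0, 1), (0, 0, 0), (1, 0, 0)))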

Upvotes: 1
