PewPew

Reputation: 1

How to apply Least Squares for curve fitting of Parametric Polynomial Cubic Curve

Given the parametric planar curve r(u) = (x(u), y(u)), where x = f(u) and y = g(u), perform curve fitting to find an approximation to r(u) with a parametric polynomial cubic curve. Least squares estimation needs to be applied to solve the problem.

Clarifications I would like to ask:

  1. Do I do curve fitting individually for x = f(u) and y = g(u), then combine the estimated x and y values from the resulting polynomial curves to plot an estimate of r(u)?

  2. Or do I do the curve fitting for r(u) directly and find the parametric polynomial cubic curve accordingly?

  3. Do I apply a linear least-squares approach or a non-linear one? I would think it is non-linear, since the aim is to produce a parametric polynomial cubic curve.

  4. Will it be possible to solve the problem by applying the least_squares function in SciPy, specifically with a Gauss-Newton-type method?
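To illustrate question 1, here is a minimal sketch of fitting each coordinate separately on synthetic data (the curve r(u) = (cos πu, sin πu) is just an example stand-in for f and g). Note that a cubic is linear in its coefficients, which bears on question 3:

```python
import numpy as np

# sample a known parametric curve r(u) = (cos(pi*u), sin(pi*u)) on u in [0, 1]
u = np.linspace(0.0, 1.0, 50)
x = np.cos(np.pi * u)
y = np.sin(np.pi * u)

# fit each coordinate independently with a cubic; the model is linear in
# its coefficients, so ordinary (linear) least squares suffices
cx = np.polyfit(u, x, 3)  # coefficients of x(u), highest degree first
cy = np.polyfit(u, y, 3)

# evaluate the fitted parametric cubic at the same parameter values
x_fit = np.polyval(cx, u)
y_fit = np.polyval(cy, u)
```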

Hoping I could get some help with coding the above in Python.

Thank you very much.

Tried with least_squares in Python but to no avail.

Expecting a methodological approach to this.

Upvotes: 0

Views: 326

Answers (2)

Kroa

Reputation: 1

I just faced a similar problem; hopefully this answer is still useful.

Method 1, regressing x and y individually, seems to work well. Here is an implementation in Python using SciPy's least_squares.

import numpy as np
from scipy.optimize import least_squares

def polynome(x, coefs):
    # evaluate a polynomial with coefficients ordered low degree first
    x_powers = np.stack([x**i for i in range(coefs.shape[-1])], axis=1)
    return np.sum(coefs * x_powers, axis=1)

def fit_parametric_poly(Y, u, degrees=(3, 3), inputs_size=1000):
    """
    Y            array of shape [n_samples, n_dims]
    u            the parameter values, in the range [0, 1]
    degrees      the degree of each coordinate's polynomial
    inputs_size  the number of samples used for evaluation
    """
    coefs_list = []
    for i in range(Y.shape[1]):
        y = np.ravel(Y[:, i])
        coefs = np.ones(degrees[i] + 1)

        # residuals of this coordinate's polynomial on the parameter values u
        def loss_func(coefs):
            return y - polynome(u, coefs)

        res = least_squares(loss_func, coefs)
        coefs_list.append(res.x)

    inputs = np.linspace(0, 1, inputs_size)
    preds = np.stack([polynome(inputs, coefs) for coefs in coefs_list], axis=1)

    return inputs, preds, coefs_list

In case you have outliers, you can change the loss to "huber" or "soft_l1" for a more robust fit.
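For instance, the least_squares call could pass loss="soft_l1" so large residuals are down-weighted. A self-contained sketch on synthetic outlier-contaminated data (the true coefficients here are made up for illustration):

```python
import numpy as np
from scipy.optimize import least_squares

# noisy samples of the cubic y = 1 - 2u + 0.5u^3, with a few outliers injected
rng = np.random.default_rng(0)
u = np.linspace(0.0, 1.0, 100)
y = 1.0 - 2.0 * u + 0.5 * u**3 + rng.normal(0.0, 0.01, u.size)
y[::20] += 1.0  # every 20th point becomes an outlier

def residuals(coefs):
    # cubic model, coefficients ordered low degree first (reversed for polyval)
    return y - np.polyval(coefs[::-1], u)

# soft_l1 down-weights large residuals, so the outliers pull the fit far less
# than they would under the default squared loss
res = least_squares(residuals, np.ones(4), loss="soft_l1", f_scale=0.1)
```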

Upvotes: 0

Try the PolynomialFeatures class from scikit-learn:

from sklearn.preprocessing import PolynomialFeatures

poly_features = PolynomialFeatures(degree=2, include_bias=False)
X_poly_2 = poly_features.fit_transform(X)

Upvotes: 0
