kyraus

Reputation: 48

Scipy optimize curve_fit weird behavior

I want to fit an exponential curve to data points, but for some reason I get weird results from scipy's curve_fit function. Here is my code:

import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit

def f(x, lambda_):
    return 100 * (1 - np.exp(-lambda_ * np.array(x)))

x = [18.75, 30.75]
y = [48.69, 49.11]
lambda_ = curve_fit(f, x, y)[0]

plt.scatter(x, y, color='black')
xx = np.linspace(0, 40)
yy = f(xx, lambda_)
plt.plot(xx, yy)

Resulting in this plot:

[plot of the two data points and the fitted curve]

This is clearly not the lambda that minimizes the squared error. Does anyone know why?

Thanks for the help.

Upvotes: 0

Views: 264

Answers (1)

Andrea Di Iura

Reputation: 457

It is related to the starting point p0 in curve_fit; by default every parameter starts at 1.

By setting p0=[-0.6] you can find a better solution.
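
For instance (a minimal sketch, reusing f, x and y from the question, with the starting value suggested above):

lambda_ = curve_fit(f, x, y, p0=[-0.6])[0]
print(lambda_)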

A simple way to find a good starting point is a numerical scan over random initial values, for example:

rng = np.random.default_rng()

n_trials = 100
err_best = np.inf
lambda_best_ = None
p0_best = None

for _ in range(n_trials):
    # draw a random starting point for lambda_
    p0 = rng.uniform(-10, 10)
    try:
        lambda_ = curve_fit(f, x, y, p0=[p0])[0]
        # sum of squared residuals for this fit
        err = ((f(x, lambda_) - y)**2).sum()
        if err < err_best:
            err_best = err
            lambda_best_ = lambda_
            p0_best = p0
    except Exception:
        # skip starting points where the fit fails
        pass
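
You could then, for example, print the best result and plot the corresponding curve (a small sketch reusing x, y, f, p0_best, lambda_best_ and err_best from above):

print(p0_best, lambda_best_, err_best)
xx = np.linspace(0, 40)
plt.scatter(x, y, color='black')
plt.plot(xx, f(xx, lambda_best_))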

Upvotes: 1
