Todd

Reputation: 439

Non-Linear Least Square Fitting Using Python

I am trying to find the optimal parameter values based on two-dimensional data.
I searched other Q&A threads; one question looked similar to mine and its answer seemed doable for me to replicate, but I get the following error:

TypeError: only size-1 arrays can be converted to Python scalars

I barely changed the code, only customized it a bit, but it seems my equation does not accept NumPy arrays. Is there a way to address this issue and derive the parameter values?
Below is the code:

from scipy.optimize import curve_fit
import numpy as np
import matplotlib.pyplot as plt
import math

t_data = np.array([0,5,10,15,20,25,27])
y_data = np.array([1771,8109,22571,30008,40862,56684,59101])

def func_nl_lsq(t, *args):
    K, A, L, b = args
    return math.exp(L*x)/((K+A*(b**x)))

popt, pcov = curve_fit(func_nl_lsq, t_data, y_data, p0=[1, 1, 1, 1])
plt.plot(t_data, y_data, 'o')
plt.plot(t_data, func_nl_lsq(t_data, *popt), '-')
plt.show()
print(popt[0], popt[1], popt[2], popt[3])

For more clarification, I also attach the equation (1) that I want to fit with NLS, and the data observations from the journal paper that I've read.

y = exp(L*x) / (K + A*b^x)    ... (1)

year (x_data; in the code, t_data = year − 1960)    value (y_data)
1960 1771
1965 8109
1970 22571
1975 30008
1980 40862
1985 56684
1987 59101

Upvotes: 1

Views: 656

Answers (1)

Caridorc

Reputation: 6641

Ok so there was quite some work to do:

  • Set x = t (probably a typo)
  • Use np.exp instead of math.exp; np.exp is also what the docs use: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.curve_fit.html (a quick demonstration of the difference follows this list)
  • Change the initial guesses to 0.1, as nonlinear optimization is sensitive to initial guesses. Take care to choose guesses as near as possible to your ballpark estimate of the parameters' order of magnitude. If you cannot estimate them, run the fitting algorithm many times with many different initial guesses and select the (non-failed) fit with the smallest sum of squared residuals (a multi-start sketch follows this list); see here to learn how to calculate those residuals: Getting the r-squared value using curve_fit
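
For the second bullet, here is a quick standalone check (a hypothetical snippet, not from the original post) of why math.exp triggers the TypeError while np.exp does not:

import math
import numpy as np

x = np.array([0, 5, 10])
print(np.exp(0.1 * x))   # vectorized: returns the element-wise exponential as an array
try:
    math.exp(0.1 * x)    # math.exp expects a single scalar, not an array
except TypeError as e:
    print(e)             # "only size-1 arrays can be converted to Python scalars"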

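For the third bullet, here is a minimal multi-start sketch; the 100 restarts and the log-uniform guess range between 1e-4 and 1 are arbitrary illustration choices, not values from the question:

from scipy.optimize import curve_fit
import numpy as np

t_data = np.array([0,5,10,15,20,25,27])
y_data = np.array([1771,8109,22571,30008,40862,56684,59101])

def func_nl_lsq(t, K, A, L, b):
    return np.exp(L*t) / (K + A*(b**t))

rng = np.random.default_rng(0)
best_popt, best_ss = None, np.inf
for _ in range(100):
    p0 = 10.0 ** rng.uniform(-4, 0, size=4)  # one random guess per parameter, spanning several orders of magnitude
    try:
        popt, _ = curve_fit(func_nl_lsq, t_data, y_data, p0=p0, maxfev=5000)
    except (RuntimeError, ValueError):
        continue  # this start failed to converge, try the next one
    ss_res = np.sum((y_data - func_nl_lsq(t_data, *popt)) ** 2)  # sum of squared residuals
    if ss_res < best_ss:
        best_popt, best_ss = popt, ss_res

print(best_popt, best_ss)
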
Here is the final result:

from scipy.optimize import curve_fit
import numpy as np
import matplotlib.pyplot as plt

t_data = np.array([0,5,10,15,20,25,27])
y_data = np.array([1771,8109,22571,30008,40862,56684,59101])

def func_nl_lsq(t, *args):
    K, A, L, b = args
    x=t # typo on your part?
    exponential = np.exp(L*x) # np.exp rather than math.exp
    denominator = K+A*(b**x)
    return exponential/denominator

popt, pcov = curve_fit(func_nl_lsq, t_data, y_data, p0=[0.1, 0.1, 0.1, 0.1]) # changed initial conditions
plt.plot(t_data, y_data, 'o')
plt.plot(t_data, func_nl_lsq(t_data, *popt), '-')
plt.show()
print(popt[0], popt[1], popt[2], popt[3])

With output:

7.36654485198494e-05 0.0011449352764853055 0.05547299686499709 0.5902083262954019

And plot:

[plot of the data points ('o') and the fitted curve ('-')]
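
As a follow-up to the r-squared link above, here is a quick fit-quality check (a sketch that reuses t_data, y_data, func_nl_lsq and popt from the code above):

ss_res = np.sum((y_data - func_nl_lsq(t_data, *popt)) ** 2)  # residual sum of squares
ss_tot = np.sum((y_data - np.mean(y_data)) ** 2)             # total sum of squares
r_squared = 1 - ss_res / ss_tot
print(r_squared)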

Upvotes: 1
