Reputation: 411
I have spent several hours trying to get my head around the scipy.optimize.minimize function.
I have got this working:
import numpy as np
import scipy.stats
from scipy.optimize import minimize

def poly_fun(coeffs, a, x):
    predicted = 10**np.polynomial.polynomial.polyval(np.log10(a), coeffs)
    slope, intercept, r_value, p_value, std_err = scipy.stats.linregress(x, predicted)
    return slope  # intercept, r_value, p_value, std_err

res = minimize(poly_fun, x0=original_polynomial, args=(a, x), method='Nelder-Mead')
I also have a plotting function in my poly_fun to visualise what is going on.
Basically, I want to improve the slope and intercept rather than just the r2 of my polynomial. The polynomial transforms something which is then compared against a set of 'known' data to see how good the estimate is; loop and repeat to hopefully get an optimized polynomial.
Maybe I am missing constraints? I can’t work out how to use them.
In MATLAB, functions like fgoalattain can take goal and weighting values during the optimization.
I would like to get my slope: 1, r2: 1 and intercept: 0, or as close as possible. However, I can't work out which options of the function to use, or whether I am using the wrong method. I haven't seen anything regarding this kind of goal attainment in the documentation.
The code as it stands basically tries to make y = 0 and flattens the linear trend to the bottom, whereas I would like a 1:1 relationship.
I have tried xtol, jac=True and some others, returning the slope, intercept and r2, but I can't seem to get it to work.
Upvotes: 2
Views: 2273
minimize looks for a minimum of a given (scalar) objective function. It does not deal with multiobjective problems. It can be used for multiobjective problems only by passing in a single objective function like (slope-1)**2 + (r_value-1)**2 + intercept**2.
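For instance, a minimal sketch of that single-scalar approach, reusing the transform from the question (combined_objective is just an illustrative name; original_polynomial, a and x are as defined there):

import numpy as np
import scipy.stats
from scipy.optimize import minimize

def combined_objective(coeffs, a, x):
    # Same transform as the question's poly_fun: evaluate the polynomial in log space.
    predicted = 10**np.polynomial.polynomial.polyval(np.log10(a), coeffs)
    slope, intercept, r_value, p_value, std_err = scipy.stats.linregress(x, predicted)
    # One scalar that is zero exactly when slope = 1, r = 1 and intercept = 0.
    return (slope - 1)**2 + (r_value - 1)**2 + intercept**2

res = minimize(combined_objective, x0=original_polynomial, args=(a, x), method='Nelder-Mead')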
However, in such cases it is preferable to use the specialized minimizer least_squares, passing in a function that returns the vector [slope-1, r_value-1, intercept]. If you also want to attach weights [w1, w2, w3], return [w1, w2, w3] * [slope-1, r_value-1, intercept] instead. So, with weights 3, 4, 5 it would be:
from scipy.optimize import least_squares

def poly_fun(coeffs, a, x):
    predicted = 10**np.polynomial.polynomial.polyval(np.log10(a), coeffs)
    slope, intercept, r_value, p_value, std_err = scipy.stats.linregress(x, predicted)
    # Weighted residual vector: all zeros when slope = 1, r = 1 and intercept = 0.
    return np.array([3, 4, 5]) * np.array([slope - 1, r_value - 1, intercept])

res = least_squares(poly_fun, x0=original_polynomial, args=(a, x))
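The optimized coefficients then come back in res.x, just as with minimize.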
There are other loss functions besides the plain sum of squares that one can use in least_squares: see the docs.
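For example, a sketch using the built-in robust soft_l1 loss (one of the options listed in the least_squares documentation):

res = least_squares(poly_fun, x0=original_polynomial, args=(a, x), loss='soft_l1')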
Upvotes: 3