Crossfit_Jesus

Reputation: 73

Constrained Optimization in Python using Scipy

I have a function f = a*y1 + b*y2 + c*y3 + d*y4 + e*y5 + f*y6 and I need to minimize root_mean_squared_error(f). The linear constraint is a+b+c+d+e+f = 1, and the bounds are that each of a, b, c, ... should lie in [0, 1]. I'm trying to find the optimized values of a, b, c, d, e and f. I am using Scipy, but I feel I have not been able to set the bounds and constraints properly in the 'minimize' call, as follows (y0 is the original test value, which is given, say):

    import numpy as np
    from scipy.optimize import minimize
    from scipy.optimize import LinearConstraint
    def root_mean_squared_error(y1, y2):
        squared_difference = (y1 - y2)**2
        mean_squared = np.mean(squared_difference)
        return np.sqrt(mean_squared)

    def rms(params):
        a, b, c, d, e, f = params
        yF = a*y1 + b*y2 + c*y3 + d*y4 + e*y5 + f*y6
        return root_mean_squared_error(y0, yF)
    initial_guess = [0.2, 0.1, 0.2, 0.05, 0.3, 0.15]
    constraint = LinearConstraint([1,1,1,1,1,1],1,1)
    bound = ([0, 1],[0, 1],[0, 1],[0, 1],[0, 1],[0, 1])
    res = minimize(rms, initial_guess, method='nelder-mead', bounds = bound,constraints = constraint)
    print(res.x)

I am getting very small values that don't add up to one, such as the following:

[1.28941447e-04 1.90583408e-04 8.50096927e-05 2.08311702e-04 1.17829816e-04 0.00000000e+00]

Is this the right way to use scipy's linear constraints and bounds?

Upvotes: 0

Views: 1192

Answers (1)

Z Li

Reputation: 4318

From the scipy.optimize.minimize documentation:

> Constraints (only for COBYLA, SLSQP and trust-constr).

So you need to change method='nelder-mead' to one of these, otherwise the constraint is silently ignored.

Since yours is an equality constraint (COBYLA only handles inequality constraints), you would have to go with method='SLSQP' or method='trust-constr'.
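A minimal sketch of the corrected call with method='SLSQP', using randomly generated stand-ins for y0 and y1...y6 (substitute your own arrays). The dict-style equality constraint is used here since it works with SLSQP across SciPy versions:

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic data, just for illustration: six series of length 50,
# with y0 built from a known mix so the optimum is recoverable.
rng = np.random.default_rng(0)
Y = rng.random((6, 50))                        # rows play the role of y1..y6
true_weights = np.array([0.2, 0.1, 0.2, 0.05, 0.3, 0.15])
y0 = Y.T @ true_weights                        # target series

def rms(params):
    yF = Y.T @ params                          # a*y1 + b*y2 + ... + f*y6
    return np.sqrt(np.mean((y0 - yF) ** 2))

initial_guess = np.full(6, 1 / 6)
bounds = [(0, 1)] * 6                          # each weight in [0, 1]
constraint = {'type': 'eq', 'fun': lambda p: p.sum() - 1}  # weights sum to 1

res = minimize(rms, initial_guess, method='SLSQP',
               bounds=bounds, constraints=constraint)
print(res.x, res.x.sum())
```

With SLSQP the constraint is actually enforced, so res.x.sum() comes out at 1 to numerical precision.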

Upvotes: 1
