Reputation: 1063
I want to use the scipy.optimize.minimize function without specifying my constraints as lambda functions. Is this possible?
i.e. for the standard example:
from scipy.optimize import minimize

def fun(x):
    return (x[0] - 1) ** 2 + (x[1] - 2.5)**2.

x = (2, 0)

def c_0(x):
    return x[0] - 2. * x[1] + 2.

def c_1(x):
    return -x[0] - 2. * x[1] + 6.

def c_2(x):
    return -x[0] + 2. * x[1] + 2.

cons = ({'type': 'ineq', 'fun': c_0(x)},
        {'type': 'ineq', 'fun': c_1(x)},
        {'type': 'ineq', 'fun': c_2(x)})

bnds = ((0, None), (0, None))
res = minimize(fun(x), x, method='SLSQP', bounds=bnds, constraints=cons)
The reason I want to avoid using lambda functions is that the number of constraints grows quickly with my problem size (2 * the number of degrees of freedom), so unless there is a way of creating a "lambda" factory for my constraints, writing them out explicitly will become tedious very quickly.
The above code snippet returns:
TypeError: 'float' object is not callable
Upvotes: 0
Views: 1141
Reputation: 1165
Pass the function objects themselves instead of calling them. Writing c_0(x) evaluates the constraint at x and stores a float in the dict, and fun(x) does the same in the minimize call; that is what triggers "'float' object is not callable", because minimize expects callables. This works for me:
from scipy.optimize import minimize

def fun(x):
    return (x[0] - 1) ** 2 + (x[1] - 2.5)**2.

x = (2, 0)

def c_0(x):
    return x[0] - 2. * x[1] + 2.

def c_1(x):
    return -x[0] - 2. * x[1] + 6.

def c_2(x):
    return -x[0] + 2. * x[1] + 2.

cons = ({'type': 'ineq', 'fun': c_0},
        {'type': 'ineq', 'fun': c_1},
        {'type': 'ineq', 'fun': c_2})

bnds = ((0, None), (0, None))
res = minimize(fun, x, method='SLSQP', bounds=bnds, constraints=cons)
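Regarding the scaling concern: you do not need to write each constraint (or a lambda) by hand. Below is a minimal sketch of a constraint "factory", assuming your constraints can be expressed in the linear form A[i] @ x + b[i] >= 0; the arrays A and b and the helper make_constraint are hypothetical names chosen for illustration, with values that reproduce c_0, c_1, c_2 from the question. A closure is used so each row is bound at creation time, which avoids the late-binding pitfall of defining lambdas inside a loop.

import numpy as np
from scipy.optimize import minimize

def fun(x):
    return (x[0] - 1) ** 2 + (x[1] - 2.5) ** 2

# Hypothetical coefficient arrays: row i encodes A[i] @ x + b[i] >= 0.
# These values correspond to c_0, c_1, c_2 above.
A = np.array([[ 1.0, -2.0],
              [-1.0, -2.0],
              [-1.0,  2.0]])
b = np.array([2.0, 6.0, 2.0])

def make_constraint(a_row, b_i):
    # The inner function captures a_row and b_i now, so every
    # constraint keeps its own coefficients.
    def c(x):
        return a_row @ x + b_i
    return {'type': 'ineq', 'fun': c}

cons = [make_constraint(a_row, b_i) for a_row, b_i in zip(A, b)]

x0 = (2, 0)
bnds = ((0, None), (0, None))
res = minimize(fun, x0, method='SLSQP', bounds=bnds, constraints=cons)
print(res.x)  # should be close to [1.4, 1.7] for this example

minimize accepts the constraints as a list of dicts just as well as a tuple, so you can generate as many as the problem needs. If all of your constraints really are linear, scipy.optimize.LinearConstraint (with a method that supports it, such as 'trust-constr') may also be worth a look.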
Upvotes: 1