Reputation: 29
I'm using scipy.optimize.minimize with the 'SLSQP' method. According to the documentation:
bounds : sequence, optional
Bounds for variables (only for L-BFGS-B, TNC and SLSQP). (min, max) pairs for each element in x, defining the bounds on that parameter. Use None for one of min or max when there is no bound in that direction.
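For context, a minimal sketch of that bounds interface (the quadratic objective here is my own illustration, not from the documentation); None marks the unbounded side:

```python
import scipy.optimize as optimize

# x[0] is bounded to [0, 15]; x[1] is bounded below by 0 and
# unbounded above, which is expressed with None
res = optimize.minimize(lambda x: (x[0] - 20)**2 + (x[1] - 3)**2,
                        [1.0, 1.0],
                        method='SLSQP',
                        bounds=[(0, 15), (0, None)])
print(res.x)  # x[0] pushed to its upper bound 15, x[1] free to reach 3
```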
I'd like to know whether it is possible to define a bound that is NOT a single continuous interval for a variable x, such as (0,15) & (30,50) (x is between 0 and 15 and between 30 and 50).
Or are there any other better methods to achieve this?
Thank you guys in advance!
Upvotes: 0
Views: 2215
Reputation: 16724
x is between 0 and 15 and between 30 and 50
This would make the model infeasible: there is no such x. You probably mean:
x is between 0 and 15 OR between 30 and 50
This is non-convex, so standard local solvers have trouble with it. It is often modeled with an extra binary variable:
30 δ ≤ x ≤ 15(1-δ) + 50 δ
δ ∈ {0,1}
Of course, this assumes you can handle binary variables (SLSQP can't). Models with binary variables and nonlinear constraints (or a nonlinear objective function) are called MINLP models (Mixed-Integer Nonlinear Programming). Solvers for this type of model are readily available.
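As a plain-Python sanity check (interval_for is a hypothetical helper name, not part of any solver API), the bounds implied by the formulation above collapse to exactly the two desired intervals:

```python
def interval_for(delta):
    # lower bound: 30*delta; upper bound: 15*(1 - delta) + 50*delta
    return (30*delta, 15*(1 - delta) + 50*delta)

print(interval_for(0))  # (0, 15)  -> x in the first interval
print(interval_for(1))  # (30, 50) -> x in the second interval
```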
Some other approaches that may work:

Solve the problem twice: once with 0 ≤ x ≤ 15 and once with 30 ≤ x ≤ 50. Then pick the best solution.

Use the scipy.optimize.basinhopping global solver to help your way out of a local optimum. This is not a rigorous algorithm (no guarantees), but it can help.

Some approaches that typically don't work:

Instead of the binary δ ∈ {0,1}, use a continuous variable δ ∈ [0,1] and add the constraint δ(1-δ) = 0. Typically this will get you stuck.

Try to forbid the middle region x ∈ [15,30] with a constraint. This also does not work with a local solver.

Upvotes: 2
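The "solve twice, then pick the best solution" idea can be sketched as follows; the toy objective f(x) = (x - 20)² is my own illustration, not part of the answer:

```python
import scipy.optimize as optimize

def f(x):
    # toy objective (assumption for illustration): minimum at x = 20,
    # which lies in the forbidden gap between the two intervals
    return (x - 20)**2

# solve once per disjoint interval, with a starting guess inside each
candidates = [
    optimize.minimize(f, [5.0], method='SLSQP', bounds=[(0, 15)]),
    optimize.minimize(f, [40.0], method='SLSQP', bounds=[(30, 50)]),
]

# keep the better of the two local solutions
best = min(candidates, key=lambda r: r.fun)
print(best.x)  # the global minimum over both intervals is at x = 15
```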
Reputation: 879631
Here is an attempt at implementing the basinhopping approach described by Erwin Kalvelagen.
First, construct a polynomial with roots at 0, 15, 30, and 50 which is positive in the desired region:
In [123]: x = np.linspace(-1, 51, 100)
In [124]: plt.plot(x,-(x-0)*(x-15)*(x-30)*(x-50))
Out[124]: [<matplotlib.lines.Line2D at 0x7fa01d65b748>]
In [145]: plt.axhline(color='red')
Out[145]: <matplotlib.lines.Line2D at 0x7fa01d6620b8>
In [146]: plt.show()
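(The transcript assumes numpy is imported as np and matplotlib.pyplot as plt.) As a plot-free sanity check, the polynomial is positive exactly inside the two admissible intervals and negative outside them; g is a hypothetical helper name:

```python
def g(x):
    # constraint polynomial with roots at 0, 15, 30 and 50;
    # the leading minus sign makes it positive on (0,15) and (30,50)
    return -(x - 0)*(x - 15)*(x - 30)*(x - 50)

print(g(7.5) > 0, g(40.0) > 0)   # inside (0,15) and (30,50): positive
print(g(-1.0) < 0, g(20.0) < 0)  # outside the intervals: negative
```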
Now you can use that polynomial as a constraint:
import numpy as np
import scipy.optimize as optimize
cons = ({'type': 'ineq', 'fun': lambda x: -(x-0)*(x-15)*(x-30)*(x-50)},)
def f(x):
return (x-20)**2
res = optimize.basinhopping(f, [40], minimizer_kwargs={'method':'SLSQP', 'constraints': cons}, niter=10, stepsize=20)
print(res)
yields
message: 'Optimization terminated successfully.'
nfev: 13
nit: 4
njev: 4
status: 0
success: True
x: array([15.])
message: ['requested number of basinhopping iterations completed successfully']
minimization_failures: 0
nfev: 178
nit: 10
njev: 56
x: array([15.])
Note that the initial guess was at 40, and yet optimize.basinhopping managed to find the minimum in the other, non-contiguous interval at x = 15.

Using a stepsize on the order of the distance between the two intervals is important, to give basinhopping an opportunity to sample from both intervals.

Without basinhopping, optimize.minimize using SLSQP with a non-convex constraint may fail to sample from all admissible intervals. For example,
import scipy.optimize as optimize
cons = ({'type': 'ineq', 'fun': lambda x: -(x-0)*(x-15)*(x-30)*(x-50)}, )
def f(x):
return (x-20)**2
res = optimize.minimize(
f, [40], method='SLSQP', constraints=cons)
print(res.x)
# [30.]
res = optimize.minimize(
f, [5], method='SLSQP', constraints=cons)
print(res.x)
# [15.]
shows that you would have to run optimize.minimize twice, with a guess in each interval, in order to find the true minimum.
Upvotes: 2