eruiz

Reputation: 59

Gradient-Based Optimizations in Python

I am trying to solve a couple of minimization problems using Python, but the setup with constraints is difficult for me to understand. I have:

minimize: x+y+2z^2 subject to: x = 1 and x^2+y^2 = 1

This is obviously very easy, and I know the solution is x=1, y=0, z=0. I tried to use scipy.optimize.minimize with method='L-BFGS-B' but had issues.

I also have: minimize: 2x1^2+x2^2 subject to: x1+x2=1

I need to use a gradient-based optimizer, so I chose scipy.optimize.minimize with method='COBYLA', but had issues using an equality constraint, as it only takes inequality constraints. The code for this is:

from scipy.optimize import minimize

def objective(x):
    x1 = x[0]
    x2 = x[1]
    return 2*(x1**2) + x2
def constraint1(x):
    return x[0] + x[1] - 1
# Try an initial condition of x1=0.3 and x2=0.7
# Our initial condition satisfies the constraint already
x0 = [0.3, 0.7]
print(objective(x0))
xnew = [0.25, 0.75]
print(objective(xnew))
# Since we have already calculated on paper, we know that x1 and x2 fall between 0 and 1,
# so we can set the bounds for both variables as (0, 1)
b = (0, 1)
bounds = (b, b)
# Let's make note of the type of constraint we have for our optimizer
con1 = {'type': 'eq', 'fun': constraint1}
cons = [con1]
sol_gradient = minimize(objective, x0, method='COBYLA', bounds=bounds, constraints=cons)

Then I get an error about using equality constraints with this optimizer.

Upvotes: 0

Views: 2572

Answers (1)

PiyushC

Reputation: 308

A few things:

  1. Your objective function does not match the description you have provided. Should it be 2*(x1**2) + x2**2?
  2. From the scipy.optimize.minimize docs you can see that COBYLA does not support eq as a constraint type. From the page:

Note that COBYLA only supports inequality constraints.

  3. Since you said you want to use a gradient-based optimizer, one option could be to use the Sequential Least Squares Programming (SLSQP) optimizer.
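As an aside on point 2: if you did want to stay with COBYLA, a common workaround (my own sketch, not code from your post) is to express the equality x1 + x2 = 1 as a pair of opposing inequalities, since COBYLA treats every constraint as fun(x) >= 0:

```python
from scipy.optimize import minimize

def objective(x):
    x1, x2 = x
    return 2*x1**2 + x2**2

# COBYLA only accepts 'ineq' constraints (fun(x) >= 0), so the equality
# x1 + x2 = 1 is enforced by two opposing inequalities:
# x1 + x2 - 1 >= 0 together with 1 - x1 - x2 >= 0 forces x1 + x2 = 1.
cons = [
    {'type': 'ineq', 'fun': lambda x: x[0] + x[1] - 1},
    {'type': 'ineq', 'fun': lambda x: 1 - x[0] - x[1]},
]

sol = minimize(objective, [0.3, 0.7], method='COBYLA', constraints=cons)
print(sol.x, sol.fun)
```

This should land near x = [1/3, 2/3] with an objective value around 2/3, but only to within COBYLA's tolerance, since the "equality" is only approximately enforced; SLSQP handles the equality exactly and is the cleaner fit here.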

Below is the code replacing 'COBYLA' with 'SLSQP' and changing the objective function according to point 1:

from scipy.optimize import minimize

def objective(x):
    x1 = x[0]
    x2 = x[1]
    return 2*(x1**2) + x2**2
def constraint1(x):
    return x[0] + x[1] - 1
# Try an initial condition of x1=0.3 and x2=0.7
# Our initial condition satisfies the constraint already
x0 = [0.3, 0.7]
print(objective(x0))
xnew = [0.25, 0.75]
print(objective(xnew))

# Since we have already calculated on paper, we know that x1 and x2 fall between 0 and 1,
# so we can set the bounds for both variables as (0, 1)
b = (0, 1)
bounds = (b, b)
# Let's make note of the type of constraint we have for our optimizer
con1 = {'type': 'eq', 'fun': constraint1}
cons = [con1]
sol_gradient = minimize(objective, x0, method='SLSQP', bounds=bounds, constraints=cons)
print(sol_gradient)

Which gives the final answer as:

    fun: 0.6666666666666665
     jac: array([1.33333336, 1.33333335])
 message: 'Optimization terminated successfully'
    nfev: 7
     nit: 2
    njev: 2
  status: 0
 success: True
       x: array([0.33333333, 0.66666667])

Upvotes: 3
