user6787317

Reputation: 93

Maximize quadratic objective function subject to linear equality and inequality constraints in R

I'm trying to find out how to maximize a quadratic function in R with both equality and inequality constraints:

Maximize x' * H * x
subject to: Aeq * x = beq
A * x >= b and x >= 0

A simplified version of the problem could be

Maximize x^2 + y^2
subject to x + y = 1
and x, y >= 0

Since this is a maximization problem, I am unable to use the solve.QP function in the quadprog package.

I tried using constrOptim as well. But note that there is an equality constraint, and constrOptim needs an initial guess in the interior of the feasible region. As a result, constrOptim cannot be used with equality constraints.

I tried using auglag in the alabama package too. But I don't seem to get the right answer to the maximization problem.

Had the problem been a minimization problem, the answer to the simple problem would be x = 0.5 and y = 0.5. Both auglag and solve.QP give me this answer.

But I am looking for the solution to the maximization problem. By geometry, the answer lies at (x = 1, y = 0) or (x = 0, y = 1).

Upvotes: 4

Views: 2021

Answers (2)

Felipe Gerard

Reputation: 1622

You can solve this with the software you already have, provided it can deal with non-convex objectives. Two points:

  1. To get around the issue of the equality constraint, you can impose x + y >= 1 and x + y <= 1, which amounts to the same thing as x + y = 1. The same applies to a general Aeq*x = beq: use Aeq*x <= beq and Aeq*x >= beq.
  2. @jogo in the comments is right: you just have to change the sign of the objective function. As it is, you are maximizing an unbounded convex function (i.e. one with a positive semidefinite Hessian), which only makes sense because of the constraints. Therefore, the same holds for minimizing an unbounded concave function (i.e. one with a negative semidefinite Hessian). Just think about it this way: "going up as much as you can is the same as flipping the world upside down and going down as much as you can". (A short sketch of both points follows below.)
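
A minimal NumPy sketch of both points (the stacked names A_ineq, b_ineq and the negated matrix H_min are my own; the question itself is about R, but the same construction carries over there):

import numpy as np

# Data for the toy problem: maximize x' H x  subject to  Aeq x = beq, x >= 0
H = np.eye(2)                   # objective x^2 + y^2
Aeq = np.array([[1.0, 1.0]])    # x + y
beq = np.array([1.0])

# Point 1: the equality Aeq x = beq becomes the inequality pair
#          Aeq x >= beq  and  -Aeq x >= -beq
A_ineq = np.vstack([Aeq, -Aeq, np.eye(2)])          # last block encodes x >= 0
b_ineq = np.concatenate([beq, -beq, np.zeros(2)])

# Point 2: maximizing x' H x is the same as minimizing x' (-H) x,
# so hand a minimizer -H instead of H.
H_min = -H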

Upvotes: -1

sascha

Reputation: 33532

This is not a complete answer, but it may show you some alternative algorithmic approaches.

Problem

This problem seems non-convex, which makes it hard to solve (and limits the amount of good software available).

General Nonlinear Optimization

As Christoph mentioned in the comments, general nonlinear optimization is a possible approach. Of course we lose guarantees of globally optimal solutions. Something that uses the excellent open-source solver ipopt internally would be a good first try (sketch below).
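
For instance, a local-NLP sketch with pyomo and ipopt (this assumes an ipopt binary is installed and on the path; ipopt is a local solver, so the result depends on the starting point and carries no global-optimality guarantee):

from pyomo.environ import *

model = ConcreteModel()

# start away from the symmetric saddle point (0.5, 0.5)
model.x = Var(within=NonNegativeReals, initialize=0.9)
model.y = Var(within=NonNegativeReals, initialize=0.1)

model.eq = Constraint(expr = model.x + model.y == 1)
model.obj = Objective(expr = model.x**2 + model.y**2, sense=maximize)

SolverFactory('ipopt').solve(model)
print(model.x.value, model.y.value)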

Alternative: Convex-concave / difference-of-convex programming

You might consider convex-concave programming (which solves some easy problems globally); it has a very effective heuristic called the convex-concave procedure (Yuille, Alan L., and Anand Rangarajan. "The concave-convex procedure." Neural Computation 15.4 (2003): 915-936), which should work better than the more general nonlinear approach.

I'm not sure if there is a nice way to do this in R (without doing it by hand), but in Python there is the very modern open-source research library dccp (based on cvxpy).

Code

from cvxpy import *
import dccp
from dccp.problem import is_dccp

x = Variable(1)
y = Variable(1)
constraints = [x >= 0, y >= 0, x+y == 1]
objective = Maximize(square(x) + square(y))
problem = Problem(objective, constraints)

print("problem is DCP:", problem.is_dcp())
print("problem is DCCP:", is_dccp(problem))

problem.solve(method='dccp')

print('solution (x,y): ', x.value, y.value)

Output

('problem is DCP:', False)
('problem is DCCP:', True)
iteration= 1 cost value =  -2.22820497851 tau =  0.005
iteration= 2 cost value =  0.999999997451 tau =  0.006
iteration= 3 cost value =  0.999999997451 tau =  0.0072
('solution (x,y): ', 0.99999999872569856, 1.2743612156911721e-09)

Edit/Update

Depending on the size of your problem (yours is small), you could also try global nonlinear solvers like couenne.

Code

from pyomo.environ import *

model = ConcreteModel()

model.x = Var()
model.y = Var()

model.xpos = Constraint(expr = model.x >= 0)
model.ypos = Constraint(expr = model.y >= 0)
model.eq = Constraint(expr = model.x + model.y == 1)
model.obj = Objective(expr = model.x**2 + model.y**2, sense=maximize)

model.preprocess()

solver = 'couenne'
solver_io = 'nl'
stream_solver = True    # True prints solver output to screen
keepfiles = True        # True prints intermediate file names (.nl, .sol, ...)
opt = SolverFactory(solver,solver_io=solver_io)
results = opt.solve(model, keepfiles=keepfiles, tee=stream_solver)

print("Print values for all variables")
for v in model.component_data_objects(Var):
  print str(v), v.value

Output

Couenne 0.5.6 -- an Open-Source solver for Mixed Integer Nonlinear Optimization
Mailing list: [email protected]
Instructions: http://www.coin-or.org/Couenne

Couenne: new cutoff value -1.0000000000e+00 (0.004 seconds)
NLP0012I 
              Num      Status      Obj             It       time                 Location
NLP0014I             1         OPT -0.5        6 0.004
Loaded instance "/tmp/tmpLwTNz1.pyomo.nl"
Constraints:            3
Variables:              2 (0 integer)
Auxiliaries:            3 (0 integer)

Coin0506I Presolve 11 (-1) rows, 4 (-1) columns and 23 (-2) elements
Clp0006I 0  Obj -0.9998 Primal inf 4.124795 (5) Dual inf 0.999999 (1)
Clp0006I 4  Obj -1
Clp0000I Optimal - objective value -1
Clp0032I Optimal objective -1 - 4 iterations time 0.002, Presolve 0.00
Clp0000I Optimal - objective value -1
NLP Heuristic: NLP0014I             2         OPT -1        5 0
no solution.
Clp0000I Optimal - objective value -1
Optimality Based BT: 0 improved bounds
Probing: 0 improved bounds
NLP Heuristic: no solution.
Cbc0013I At root node, 0 cuts changed objective from -1 to -1 in 1 passes
Cbc0014I Cut generator 0 (Couenne convexifier cuts) - 0 row cuts average 0.0 elements, 2 column cuts (2 active)
Cbc0004I Integer solution of -1 found after 0 iterations and 0 nodes (0.00 seconds)
Cbc0001I Search completed - best objective -1, took 0 iterations and 0 nodes (0.01 seconds)
Cbc0035I Maximum depth 0, 0 variables fixed on reduced cost

couenne: Optimal

    "Finished"

Linearization cuts added at root node:         12
Linearization cuts added in total:             12  (separation time: 0s)
Total solve time:                           0.008s (0.008s in branch-and-bound)
Lower bound:                                   -1
Upper bound:                                   -1  (gap: 0.00%)
Branch-and-bound nodes:                         0
Print values for all variables
x 0.0
y 1.0

Upvotes: 3
