Bobbybobbobbo

Reputation: 51

Python equivalent for Matlab's fmincon for constrained nonlinear optimization

I'm currently trying to implement a Python script for solving a constrained nonlinear optimization problem with ~1000 variables and ~700 constraints (both linear and nonlinear). There already exists a Matlab implementation of this script, so I'm essentially just trying to find a Python solver that has equivalent performance to Matlab's fmincon().

The Matlab code has analytic solutions for both the objective and constraint hessian and jacobian, which I've rewritten entirely in Python. I've verified that the Python code is correct by calling the Python hessian/jacobian function from fmincon() and making sure that the same solution is reached. For reference, the Matlab code passes the following options to fmincon():

fmincon options:

   Options used by current Algorithm ('interior-point'):
   (Other available algorithms: 'active-set', 'sqp', 'sqp-legacy', 'trust-region-reflective')

   Set properties:
                    Algorithm: 'interior-point'
               CheckGradients: 0
                      Display: 'none'
         HessianApproximation: 'on'
                   HessianFcn: @(x,lambda)hessian(x,lambda,bCells,d0,rBX,rBY,mask)
       MaxFunctionEvaluations: 1000000
                MaxIterations: 2000
          OptimalityTolerance: 1.0000e-06
    SpecifyConstraintGradient: 1
     SpecifyObjectiveGradient: 1

   Default properties:
           BarrierParamUpdate: 'monotone'
          ConstraintTolerance: 1.0000e-06
     FiniteDifferenceStepSize: 'sqrt(eps)'
         FiniteDifferenceType: 'forward'
           HessianMultiplyFcn: []
                  HonorBounds: 1
               ObjectiveLimit: -1.0000e+20
                    OutputFcn: []
                      PlotFcn: []
                 ScaleProblem: 0
                StepTolerance: 1.0000e-10
          SubproblemAlgorithm: 'factorization'
                     TypicalX: 'ones(numberOfVariables,1)'
                  UseParallel: 0

So far I've attempted the trust-constr algorithm in scipy.optimize.minimize and cyipopt, but neither has worked. trust-constr doesn't perform as well and in particular seems to struggle with parameter scaling (the variables with smaller values end up far from their optimal values), while cyipopt doesn't converge at all. Both optimizers are also much slower than fmincon(). Are there any other Python packages that come close to the performance of fmincon()?
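
For reference, here is a minimal sketch of the kind of trust-constr setup I mean, with an analytic gradient, Hessian, and constraint Jacobian, illustrated on a small 4-variable test problem rather than the actual 1000-variable model:

    import numpy as np
    from scipy.optimize import minimize, NonlinearConstraint, Bounds

    # objective f(x) = x1*x4*(x1+x2+x3) + x3 with analytic derivatives
    def f(x):
        return x[0]*x[3]*(x[0] + x[1] + x[2]) + x[2]

    def grad(x):
        return np.array([x[3]*(2*x[0] + x[1] + x[2]),
                         x[0]*x[3],
                         x[0]*x[3] + 1.0,
                         x[0]*(x[0] + x[1] + x[2])])

    def hess(x):
        return np.array([[2*x[3], x[3], x[3], 2*x[0] + x[1] + x[2]],
                         [x[3],   0.0,  0.0,  x[0]],
                         [x[3],   0.0,  0.0,  x[0]],
                         [2*x[0] + x[1] + x[2], x[0], x[0], 0.0]])

    # constraints x1*x2*x3*x4 >= 25 and sum(x^2) == 40, with analytic Jacobian
    def cons(x):
        return np.array([x[0]*x[1]*x[2]*x[3], np.sum(x**2)])

    def cons_jac(x):
        return np.vstack([[x[1]*x[2]*x[3], x[0]*x[2]*x[3],
                           x[0]*x[1]*x[3], x[0]*x[1]*x[2]],
                          2*x])

    nlc = NonlinearConstraint(cons, [25.0, 40.0], [np.inf, 40.0], jac=cons_jac)
    res = minimize(f, np.array([1.0, 5.0, 5.0, 1.0]), method='trust-constr',
                   jac=grad, hess=hess,
                   bounds=Bounds(np.ones(4), 5.0*np.ones(4)),
                   constraints=[nlc], options={'gtol': 1e-6, 'xtol': 1e-10})
    print(res.x, res.fun)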

Upvotes: 1

Views: 2821

Answers (1)

John Hedengren

Reputation: 14401

There are several high-quality nonlinear programming solvers in Python. From the comments, it sounds like one issue is that the Hessian or Jacobian passed to cyipopt is not correct. One alternative is to use a modeling platform such as pyomo or gekko to supply the derivatives automatically. Here is a comparison of fmincon and gekko on the same problem (Hock-Schittkowski #71); a pyomo version of the same problem is sketched after the gekko code.

Optimization problem (Hock-Schittkowski #71):

    min   x1*x4*(x1+x2+x3) + x3
    s.t.  x1*x2*x3*x4 >= 25
          x1^2 + x2^2 + x3^2 + x4^2 = 40
          1 <= x1, x2, x3, x4 <= 5
    starting from x0 = (1, 5, 5, 1)

Matlab fmincon

% create file nlcon.m for the nonlinear constraints
function [c,ceq] = nlcon(x)
  c = 25.0 - x(1)*x(2)*x(3)*x(4);  % inequality: x1*x2*x3*x4 >= 25
  ceq = sum(x.^2) - 40;            % equality:   sum(x.^2) == 40
end

% main script: objective function
objective = @(x) x(1)*x(4)*(x(1)+x(2)+x(3))+x(3);

% initial guess
x0 = [1,5,5,1];

% variable bounds
lb = 1.0 * ones(1,4);
ub = 5.0 * ones(1,4);

% linear constraints
A = [];
b = [];
Aeq = [];
beq = [];

% nonlinear constraints
nonlincon = @nlcon;

% optimize with fmincon
%[X,FVAL,EXITFLAG,OUTPUT,LAMBDA,GRAD,HESSIAN] 
% = fmincon(FUN,X0,A,B,Aeq,Beq,LB,UB,NONLCON,OPTIONS)
x = fmincon(objective,x0,A,b,Aeq,beq,lb,ub,nonlincon);

% show final objective
disp(['Final Objective: ' num2str(objective(x))])

% print solution
disp('Solution')
disp(['x1 = ' num2str(x(1))])
disp(['x2 = ' num2str(x(2))])
disp(['x3 = ' num2str(x(3))])
disp(['x4 = ' num2str(x(4))])

Python gekko

Call gekko from MATLAB:

clear all
% Initialize model
m = py.gekko.GEKKO();
% Initialize Variables
x1 = m.Var(pyargs('value',1,'lb',1,'ub',5));
x2 = m.Var(pyargs('value',5,'lb',1,'ub',5));
x3 = m.Var(pyargs('value',5,'lb',1,'ub',5));
x4 = m.Var(pyargs('value',1,'lb',1,'ub',5));
% Define Equations
m.Equation(x1*x2*x3*x4>=25);
m.Equation(x1^2+x2^2+x3^2+x4^2==40);
% Objective
m.Obj(x1*x4*(x1+x2+x3)+x3);
% Solve
m.solve();
% Extract values from Python lists using curly brackets
disp(['x1: ' num2str(x1.VALUE{1})]);
disp(['x2: ' num2str(x2.VALUE{1})]);
disp(['x3: ' num2str(x3.VALUE{1})]);
disp(['x4: ' num2str(x4.VALUE{1})]);

Solve optimization problem directly in Python (not MATLAB) with gekko:

from gekko import GEKKO    
import numpy as np
m = GEKKO()
x = m.Array(m.Var,4,value=1,lb=1,ub=5)
x1,x2,x3,x4 = x
# change initial values
x2.value = 5; x3.value = 5
m.Equation(x1*x2*x3*x4>=25)
m.Equation(x1**2+x2**2+x3**2+x4**2==40)
m.Minimize(x1*x4*(x1+x2+x3)+x3)
m.solve()
print('x: ', x)
print('Objective: ',m.options.OBJFCNVAL)
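
The same problem can also be written in pyomo, mentioned above as another modeling-platform option. A minimal sketch (my addition, assuming an ipopt executable is available on the path); pyomo builds the exact first and second derivative information for the solver:

    import pyomo.environ as pyo

    # HS71 model: same objective, constraints, bounds, and initial guess as above
    m = pyo.ConcreteModel()
    m.x = pyo.Var([1, 2, 3, 4], bounds=(1, 5),
                  initialize={1: 1, 2: 5, 3: 5, 4: 1})
    m.obj = pyo.Objective(expr=m.x[1]*m.x[4]*(m.x[1] + m.x[2] + m.x[3]) + m.x[3])
    m.c1 = pyo.Constraint(expr=m.x[1]*m.x[2]*m.x[3]*m.x[4] >= 25)
    m.c2 = pyo.Constraint(expr=sum(m.x[i]**2 for i in [1, 2, 3, 4]) == 40)

    # IPOPT receives exact first and second derivatives from pyomo
    pyo.SolverFactory('ipopt').solve(m)
    print([pyo.value(m.x[i]) for i in [1, 2, 3, 4]])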

The modeling language compiles the model into byte-code that includes automatic differentiation for the first and second derivatives. Once the model is compiled, it runs as fast as if it were written in C or Fortran. Problems with 1000 variables and 700 constraints typically solve in a couple of seconds, depending on the degree of nonlinearity. A couple of gekko settings worth knowing at that scale are sketched below.
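
A small sketch of those settings (my addition, not part of the original answer), shown on a trivial one-variable model just to exercise the options:

    from gekko import GEKKO
    m = GEKKO(remote=False)   # compile and solve locally, no server round-trip
    x = m.Var(value=2, lb=0)  # trivial model just to demonstrate the options
    m.Minimize((x - 3)**2)
    m.options.SOLVER = 3      # 1=APOPT, 2=BPOPT, 3=IPOPT
    m.solve(disp=True)        # disp=True prints the solver iteration log
    print(x.value[0])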

Upvotes: 1
