5xum

Reputation: 5539

Bounded optimization using the Hessian matrix (scipy)

I am trying to optimize a function of a small number of variables (somewhere from 2 to 10). What I am trying to do is calculate the minimum of the function on a bounded hypercube

[0,1] x [0,1] x ... x [0,1]

The calculation of the function, its gradient, and its Hessian is all relatively simple, quick, and accurate.

Now, my problem is this:

Using scipy, I can compute the minimum of the function with either scipy.optimize.minimize(..., method='Newton-CG') or scipy.optimize.minimize(..., method='TNC'). However, Newton-CG accepts the Hessian but does not support bounds, while TNC supports bounds but does not use the Hessian.

Is there any method that will use both?
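For concreteness, a minimal sketch of the two calls in question. The objective here is a made-up quadratic purely for illustration; the actual function, gradient, and Hessian would be the ones described above:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative objective with a simple closed-form gradient and Hessian;
# the true minimizer is x = (0.3, 0.3), inside the unit hypercube.
def f(x):
    return np.sum((x - 0.3) ** 2)

def grad(x):
    return 2.0 * (x - 0.3)

def hess(x):
    return 2.0 * np.eye(len(x))

x0 = np.full(2, 0.5)

# Newton-CG: accepts the Hessian, but has no `bounds` argument.
res_newton = minimize(f, x0, jac=grad, hess=hess, method='Newton-CG')

# TNC: accepts bounds on the hypercube, but only uses the gradient.
res_tnc = minimize(f, x0, jac=grad, method='TNC', bounds=[(0.0, 1.0)] * 2)
```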

Upvotes: 0

Views: 1379

Answers (2)

Sturla Molden

Reputation: 1144

L-BFGS-B does a bounded optimisation. Like any quasi-Newton method, it approximates the Hessian, but this is often better than using the true Hessian.
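A minimal sketch of this approach, using the same kind of illustrative quadratic objective as above (the function and gradient here are stand-ins for the real ones):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative objective; its unconstrained minimizer (0.3, 0.3) lies
# inside the box, so the bounded solution coincides with it.
def f(x):
    return np.sum((x - 0.3) ** 2)

def grad(x):
    return 2.0 * (x - 0.3)

x0 = np.full(2, 0.5)

# L-BFGS-B takes bounds directly and builds its own limited-memory
# Hessian approximation from gradient evaluations.
res = minimize(f, x0, jac=grad, method='L-BFGS-B',
               bounds=[(0.0, 1.0)] * 2)
```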

Upvotes: 1

Moritz

Reputation: 5408

Here are a couple of alternatives:

Mystic, a framework that enables constrained optimization using external constraints (via, I think, Lagrange multipliers). The package builds on scipy.optimize, so it should be possible to use SciPy's methods with additional constraints.

Ipopt, and its Python bindings PyIpopt and CyIpopt. You could also look into OpenOpt.

Although developed primarily for curve fitting, lmfit makes it possible to add external constraints. It includes most of the solvers from scipy.

Upvotes: 1
