wonderboy3489

Reputation: 37

SciPy Minimize: How do I print the value of the finite difference Jacobian?

Is there a way to get scipy.optimize.minimize to print the value of the estimated gradient at each iteration? I'd like to compare it to the value of the analytical gradient I am computing.

Upvotes: 2

Views: 622

Answers (1)

JoshAdel

Reputation: 68702

You can supply a callback function to scipy.optimize.minimize, which gets called after each iteration. Inside the callback, call the same helper that the various methods use internally for the numerical approximation (scipy.optimize.optimize._approx_fprime_helper) and print both gradients. Using the rosen function as an example:

import numpy as np
from scipy.optimize import rosen, rosen_der, minimize
from scipy.optimize.optimize import _approx_fprime_helper

def callback(x):
    # Compare the analytical gradient to the finite-difference estimate
    print('exact:  ', rosen_der(x))
    print('approx: ', _approx_fprime_helper(x, rosen, 1e-8))
    print('-----')

x0 = np.zeros(5)
res = minimize(rosen, x0, method='L-BFGS-B', callback=callback)

This would give you something like:

exact:  [-2.11963396  1.84037029  1.84037037  2.00372223 -0.08167787]
approx:  [-2.11963398  1.84037128  1.84037137  2.00372323 -0.08167684]
-----
exact:  [-2.09674976  0.65207886  0.77546647  0.73540194  0.02017962]
approx:  [-2.0967498   0.65207981  0.77546742  0.73540298  0.02018057]
-----
exact:  [-1.89973856 -1.67615541 -0.88726966 -1.04665196  0.08082156]
approx:  [-1.8997385  -1.67615442 -0.88726866 -1.04665099  0.08082255]
-----
exact:  [ 0.71591999 -7.99959011 -2.81299766 -3.18692904  0.18447144]
approx:  [ 0.71592021 -7.99958908 -2.81299664 -3.18692805  0.18447244]
-----

If you dig into the optimizer source code, you'll see where _approx_fprime_helper is called. It's defined here:

https://github.com/scipy/scipy/blob/master/scipy/optimize/optimize.py#L601

Make sure the eps value you pass to _approx_fprime_helper matches the one minimize uses internally (for L-BFGS-B it can be set via the eps entry in options); otherwise the two gradients are computed with different step sizes and won't be directly comparable.
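If you'd rather not depend on a private helper, SciPy also exposes the same forward-difference routine publicly as scipy.optimize.approx_fprime. A minimal sketch, assuming you pin the step size in one place so the callback and the optimizer agree:

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der, approx_fprime

EPS = 1e-8  # one step size, shared by the callback and the optimizer

def callback(x):
    # approx_fprime is the public wrapper for the forward-difference gradient
    print('exact:  ', rosen_der(x))
    print('approx: ', approx_fprime(x, rosen, EPS))
    print('-----')

x0 = np.zeros(5)
res = minimize(rosen, x0, method='L-BFGS-B',
               options={'eps': EPS}, callback=callback)
```

Because options={'eps': EPS} forces L-BFGS-B to use the same step, the printed approximation is exactly what the optimizer sees internally.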

Upvotes: 1
