Reputation: 1373
I'm using scipy's Basin Hopping algorithm to optimize a multivariate cost function. Temperature is one of the parameters that greatly affects convergence time of the basin hopping algorithm. I would like to be able to determine how quickly basinhopping()
is converging by fitting the cost function value curve up to the current iteration and determining if it's a faster convergence than the previous temperature setting.
Here's what the basin hopping call looks like:
res = basinhopping(cost, guess, niter=1, T=t, minimizer_kwargs={"method": "cobyla"})
Is there some way to get "live" updates on the current value of the cost function so that I can do an adaptive optimization?
Upvotes: 2
Views: 608
Reputation: 21947
Do you want to find an optimal T, e.g. by golden-section search on the function of one variable
ftemperature(T) = basinhopping(... T=T ...).fun ?
(Note the result attribute is res.fun, not res.func.)
If so, build up history lists of fun, T, and res that feed your Tsearch function:
# initialize history lists for a few T (grid search) --
fhist, Thist, reshist = [], [], []
for T in [...]:
    res = basinhopping(cost, guess, T=T, ...)
    print("T %.3g fun %.3g" % (T, res.fun))
    fhist.append(res.fun)
    Thist.append(T)
    reshist.append(res)

# search for new T --
while True:
    T = Tsearch(Thist, fhist)  # golden search or ...
    if T is None:
        break
    res = basinhopping(cost, guess, T=T, ...)
    print("T %.3g fun %.3g" % (T, res.fun))
    fhist.append(res.fun)
    Thist.append(T)
    reshist.append(res)
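For the Tsearch step itself, a plain golden-section minimizer is enough. This is a generic sketch, not tied to the history lists above: the bracket `[a, b]` and tolerance are assumptions you would pick for your problem, and `f` would be a wrapper like `ftemperature` that runs basinhopping at a given T and returns `res.fun`:

```python
import math

def golden_section(f, a, b, tol=1e-2):
    # Minimize a unimodal 1-D function f on the bracket [a, b]
    # by golden-section search; returns the midpoint of the
    # final bracket. (Re-evaluates f each pass for clarity;
    # cache the interior values in production, since here f
    # may be an expensive basinhopping run.)
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi ~ 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2
```

Called as e.g. `Topt = golden_section(ftemperature, 0.1, 10.0)`. Note golden-section search assumes ftemperature(T) is roughly unimodal in T, which may only hold approximately for a stochastic optimizer; averaging a few basinhopping runs per T inside the wrapper helps.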
If not, please clarify.
(You could do the same thing inside callbacks, as @Jacob Stevenson says.)
(There are fancier methods for minimizing functions of one variable, see e.g. scipy.optimize.minimize_scalar .)
Upvotes: 1
Reputation: 3756
I'm not 100% sure I understand your question, but the basinhopping parameter callback sounds like what you're looking for.
On a side note, what you're trying to do sounds similar in concept to the paper Freeze-Thaw Bayesian Optimization.
Upvotes: 1