Helena

Reputation: 11

Why does scipy.optimize.basinhopping give different results?

I need to find the global minimum of a complicated function, and I am using basinhopping from scipy.optimize. When I change the local minimizer (for example method="Nelder-Mead" vs. method="L-BFGS-B") or the initial guess x0, I get different results, especially in the values of x, which I need for the next steps. For example, x[5] = 0.6 with "Nelder-Mead" but x[5] = 0.0008 with "L-BFGS-B", although the function values are similar: 2055.7795 vs. 2055.7756 (and all runs report success: True). I thought basinhopping found the global minimum, so it should give the same result no matter which method or initial guess I use. Can anyone explain why this happens, and suggest how I can find the global minimum and check that it is global (not just local)?

Thank you

Upvotes: 1

Views: 985

Answers (1)

Bob

Reputation: 14654

The basin-hopping method is not guaranteed to find the global minimum for an arbitrary function. It is also not deterministic, as there is a random component in the way it explores the vicinity of the current minimum, as described in the documentation for the take_step argument.

If you want to reproduce the same result across two calls, then in addition to using the same method you must also use the same seed parameter.

Using the same seed should also increase the likelihood of getting the same result with different local optimizer methods, although it does not guarantee it: different local minimizers can still descend into different basins from the same hop.
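As a sketch of what I mean (using a hypothetical multimodal test function, not your objective), fixing seed makes two identical calls reproducible, while changing only the local minimizer may still give a different x:

```python
import numpy as np
from scipy.optimize import basinhopping

# Hypothetical multimodal test function (stand-in for the real objective).
def f(x):
    return np.sum(x**2 + 10.0 * np.sin(x))

x0 = np.zeros(2)

# With the same seed and the same local minimizer, two calls give
# exactly the same random step sequence, hence the same result.
res_a = basinhopping(f, x0, niter=100, seed=42,
                     minimizer_kwargs={"method": "L-BFGS-B"})
res_b = basinhopping(f, x0, niter=100, seed=42,
                     minimizer_kwargs={"method": "L-BFGS-B"})
print(np.allclose(res_a.x, res_b.x))  # identical runs agree

# A different local minimizer can still end in a different basin,
# even with the same seed, so compare function values, not just x.
res_c = basinhopping(f, x0, niter=100, seed=42,
                     minimizer_kwargs={"method": "Nelder-Mead"})
print(res_a.fun, res_c.fun)
```

To gain confidence that a minimum is global, a common practical check is to rerun with several different seeds and starting points and verify that the best function value found stops improving.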

Upvotes: 0
