Simone PANICO

Reputation: 13

Step size in scipy optimize minimize

I am running simulations by varying 3 parameters. For each simulation I calculate an index that tells me whether the simulation is improving (a lower index is better). To do this I am using scipy.optimize.minimize, but I am not sure which method is best suited (I am trying to learn more).

def launcher(x0):
    # Set the three varying parameters, run the external simulation,
    # and return the index that the optimizer should minimize.
    varying_p(x0[0], x0[1], x0[2])
    run_sim(simulation)
    results = import_results(simulation)
    indexes = index_calc(Data_from, Data_to, simulation, results)
    value = indexes['CHI_tot'].iloc[-1]
    return value

Each parameter varies within a range that I specify with "bounds".

When I launch the optimization, each parameter is varied in very small steps. Is there a way to define not only the range of variation but also the step size for each parameter (for example, a step of 50)?

Could this help the optimization reach a result more quickly?

import numpy as np
from scipy.optimize import minimize

v_cp = 999.209
v_rho = 1249.94
v_lambda = 0.2815

initialcond = np.array([v_cp, v_rho, v_lambda])
# Bounds: valid only for methods L-BFGS-B, TNC, SLSQP, Powell, and trust-constr
bounds = [(400, 1600), (400, 1600), (0.1, 1.2)]

res = minimize(launcher, initialcond, bounds=bounds, method='L-BFGS-B',
               options={'maxiter': 10})

print(res.x)

Upvotes: 1

Views: 3164

Answers (1)

Infinity77

Reputation: 1449

All SciPy gradient-based optimizers (L-BFGS-B, SLSQP, etc.) expect a gradient of the objective function. If you don't provide one, they will approximate it numerically for you, using a ridiculously small finite-difference step size (on the order of 1e-6 or smaller). That is probably what you are seeing. A couple of workarounds:

  1. I seem to remember that some optimizers allow you to set the step size used for gradient calculations (the "eps" option); see the first sketch after this list.

  2. (Better) Normalize your parameters to between 0 and 1 when calling the optimizer and de-normalize them before calling the external simulator; see the second sketch after this list.
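A minimal sketch of option 1, reusing the question's launcher, initialcond, and bounds. For L-BFGS-B the finite-difference step is the "eps" entry in options, documented as accepting either a single float or an array of per-parameter steps; the values below are purely illustrative:

import numpy as np
from scipy.optimize import minimize

# Per-parameter finite-difference steps: ~50 for cp and rho,
# a much smaller step for lambda, whose bounds are only (0.1, 1.2).
steps = np.array([50.0, 50.0, 0.05])

res = minimize(launcher, initialcond, bounds=bounds, method='L-BFGS-B',
               options={'maxiter': 10, 'eps': steps})

Note that eps only controls the step used to estimate the gradient; the optimizer's line search can still move the parameters by other amounts.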
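And a minimal sketch of option 2, again assuming the question's launcher, initialcond, and bounds: optimize over the unit cube and map back to physical units inside the objective. With all parameters on the same 0-to-1 scale, a single finite-difference step (or the default) behaves sensibly for all of them:

import numpy as np
from scipy.optimize import minimize

lower = np.array([b[0] for b in bounds])
upper = np.array([b[1] for b in bounds])

def launcher_normalized(u):
    # Map the unit-cube point u back to physical parameter values
    # before calling the original objective.
    x = lower + u * (upper - lower)
    return launcher(x)

u0 = (initialcond - lower) / (upper - lower)   # normalized starting point
unit_bounds = [(0.0, 1.0)] * len(bounds)

res = minimize(launcher_normalized, u0, bounds=unit_bounds,
               method='L-BFGS-B', options={'maxiter': 10})
x_opt = lower + res.x * (upper - lower)        # de-normalized optimum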

Upvotes: 3
