NPL

Reputation: 79

Is there a parallelized version of scipy.optimize.minimize?

I am trying to minimize a cost function using scipy.optimize.minimize, but it is very slow. My function has close to 5000 variables, so it is not surprising that scipy is slow. However, a parallel version of scipy.optimize.minimize might help a great deal.

I was wondering if such a version of scipy.optimize.minimize exists, or if there is any other scipy/numpy tool available for performing minimization on this scale. I really appreciate any and all help.

Thanks, everybody, for your comments. This is a constrained minimization using the SLSQP solver. I have already spent a lot of time making sure that the cost function calculation is optimized, so the bottleneck must be in the calculation of the gradient or in handling the constraints. In other words, the time spent on function evaluations is only a small fraction of the total time spent on the minimization.
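(For context, here is a minimal sketch of the kind of call described above; the quadratic cost and the single equality constraint are hypothetical stand-ins, and n is kept small so the sketch runs quickly.)

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical stand-in for the real (already optimized) cost function.
    def cost(x):
        return np.sum((x - 1.0) ** 2)

    # The actual problem has ~5000 variables; a smaller n keeps this sketch fast.
    n = 200
    x0 = np.zeros(n)

    # Hypothetical equality constraint: the variables must sum to a fixed value.
    constraints = [{"type": "eq", "fun": lambda x: np.sum(x) - 100.0}]

    # Without an explicit `jac`, SLSQP approximates the gradient by finite
    # differences, which costs O(n) extra cost-function evaluations per iteration.
    result = minimize(cost, x0, method="SLSQP", constraints=constraints)
    print(result.fun, result.nit)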

Upvotes: 7

Views: 5240

Answers (1)

Nairolf

Reputation: 2556

We implemented a parallel version of scipy.optimize.minimize(method='L-BFGS-B') in the package optimparallel, available on PyPI. It can speed up the optimization by evaluating the objective function and the (approximate) gradient in parallel. I have not tested it with 5000 parameters, but with fewer parameters we observe good parallel scaling.

The Python package is an implementation of the R package optimParallel. The method is documented in this R Journal article.
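A minimal usage sketch, assuming the package's minimize_parallel entry point, which mirrors the scipy.optimize.minimize interface (the objective below is a hypothetical placeholder):

    import numpy as np
    from optimparallel import minimize_parallel

    # Hypothetical objective; in practice any expensive, smooth function.
    def cost(x):
        return np.sum((x - 1.0) ** 2)

    def main():
        x0 = np.zeros(500)
        # Objective and finite-difference gradient evaluations are distributed
        # across parallel workers; the call mirrors
        # scipy.optimize.minimize(method='L-BFGS-B').
        result = minimize_parallel(cost, x0)
        print(result.fun)

    if __name__ == "__main__":
        # Guard the entry point: process-based workers re-import the main
        # module on some platforms.
        main()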

Here is an illustration of the possible parallel scaling: [figure: parallel scaling of optimparallel]

Upvotes: 4
