Reputation: 55
I'm trying to optimize a scalar function with 10 variables by using scipy.optimize.differential_evolution.
The scalar value is actually calculated by a simulation software which takes about 7 seconds per evaluation. The problem is that even if I set popsize to 10, the algorithm needs >1000 iterations, which results in a really long computation time.
The 10 variables are 5 angles and 5 lengths:
phi_1 to phi_5 (0 to 360 degrees)
l_1 to l_5 (0 to 20 micrometer)
In each iteration the values (X_1, X_2, X_3, X_4, X_5, Y_1, Y_2, Y_3, Y_4, Y_5) are calculated by
X_i = l_i*sin(phi_i) and Y_i = l_i*cos(phi_i)
The simulation software calculates a scalar based on these inputs.
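For reference, the mapping from the 10 optimization variables to the simulation inputs can be written as a small helper (the parameter layout `[phi_1..phi_5, l_1..l_5]` is an assumption; note the angles are in degrees and must be converted to radians for numpy's trig functions):

```python
import numpy as np

def to_cartesian(params):
    """Map the 10 DE parameters (5 angles in degrees, 5 lengths in
    micrometres, layout assumed as [phi_1..phi_5, l_1..l_5]) to the
    (X, Y) inputs of the simulation."""
    phi = np.radians(params[:5])              # np.sin/np.cos expect radians
    l = np.asarray(params[5:], dtype=float)
    X = l * np.sin(phi)
    Y = l * np.cos(phi)
    return X, Y
```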
I tried to reduce popsize to 3 to 5, but then the result isn't the global optimum. I also tried different strategies like "rand1exp" and "best1exp", as well as gradient-based algorithms (SLSQP) before, but because of the sine and cosine functions the starting point is crucial, which leads to many different (local) results. I believe DE is the best algorithm for this problem, but I can't imagine that it needs >1000 iterations to solve a problem with "just" 10 variables.
I'm calling DE like this:
sol = differential_evolution(objective, popsize=10, strategy="best1bin", bounds=boundList)
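For context, a self-contained version of this call looks like the sketch below (the `objective` here is a cheap stand-in for the simulation call, and the bounds follow the description above). Since each real evaluation takes about 7 s, the biggest practical lever is `workers=-1`, which lets scipy evaluate the population members in parallel across CPU cores (this requires `updating="deferred"`):

```python
import numpy as np
from scipy.optimize import differential_evolution

def objective(x):
    # Stand-in for the ~7 s simulation call; replace with the real one.
    return np.sum((x - 1.0) ** 2)

# Bounds as described: 5 angles (0 to 360 deg), then 5 lengths (0 to 20 um)
boundList = [(0, 360)] * 5 + [(0, 20)] * 5

sol = differential_evolution(
    objective,
    bounds=boundList,
    strategy="best1bin",
    popsize=10,
    maxiter=100,          # cap the number of generations
    tol=0.01,             # loosen the convergence tolerance
    workers=-1,           # evaluate population members on all CPU cores
    updating="deferred",  # required when workers != 1
    polish=False,         # skip the final L-BFGS-B polish (extra evaluations)
    seed=1,
)
```

Whether parallel evaluation helps depends on whether several instances of the simulation software can run at once.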
Does anyone have some experience with DE and can give me some hints for the correct parameters?
Upvotes: 2
Views: 1433
Reputation: 407
I am not familiar with differential evolution algorithms, but here are some ideas you may consider:
7 sec per evaluation is a lot. I would try to reduce this computation time wherever possible...
If your criterion is continuous and differentiable, gradient-based methods would be the first approach. With 10 parameters, gradient-based algorithms require at least 10 iterations (if your criterion is quadratic). In practice you should expect many more (hundreds), depending on how hard the problem is to converge. Other algorithms may require a similar number of iterations.
The fact that your problem is degenerate does not depend on the algorithm: phi + 2*pi gives exactly the same value of the criterion, so you have an infinite number of minima. You need to start not too far from the solution, so you may try to provide smart guesses as starting values for your parameters.
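The periodicity is easy to check numerically (assuming the X/Y mapping described in the question):

```python
import numpy as np

def xy(phi_deg, l):
    # X = l*sin(phi), Y = l*cos(phi), with phi given in degrees
    p = np.radians(phi_deg)
    return l * np.sin(p), l * np.cos(p)

# phi and phi + 360 map to the same point, so the criterion is periodic:
print(np.allclose(xy(45.0, 10.0), xy(45.0 + 360.0, 10.0)))  # True
```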
Some gradient-based methods provided in SciPy (e.g. L-BFGS-B, SLSQP, trust-constr) allow you to set bounds on your parameters.
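A minimal sketch of a bounded gradient-based run, combined with a few random restarts to reduce the dependence on the starting point (the `objective` is a cheap stand-in for the simulation; the number of restarts is arbitrary):

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Stand-in for the simulation call; replace with the real one.
    return np.sum((x - 1.0) ** 2)

# 5 angles (0 to 360 deg), then 5 lengths (0 to 20 um)
bounds = [(0, 360)] * 5 + [(0, 20)] * 5
rng = np.random.default_rng(0)

best = None
for _ in range(20):  # multi-start: keep the best of several local runs
    x0 = np.array([rng.uniform(lo, hi) for lo, hi in bounds])
    res = minimize(objective, x0, method="L-BFGS-B", bounds=bounds)
    if best is None or res.fun < best.fun:
        best = res
```

With a 7 s objective, each local run still costs many evaluations, so this only pays off if the local searches converge in far fewer evaluations than DE.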
Upvotes: 2