izs

Reputation: 1

Optimization has found the global minimum but converges to a local one

I am using the stochastic optimization algorithm CMA-ES. Although it finds the global minimum in the first few cycles (I know this because it is a made-up benchmark test), after some more cycles the algorithm converges to another minimum — a local one, since it has a larger cost-function value.

Does anyone have experience with this?

Do I need to care that it converges to a local minimum, given that it has already found the global one? Is it wrong to simply take the global minimum and not worry about where the algorithm finally converged?

My interpretation of the results is that this happens because of the normal-distribution sampling: only a small fraction of sampled solutions fall near the global minimum, while a large fraction fall near the local one. (I have tried many different population sizes, but the result is the same.)

Thank you in advance for your help!

Upvotes: 0

Views: 233

Answers (1)

Leo

Reputation: 1303

It is common to keep a global "best" solution when running evolutionary algorithms, especially algorithms that are allowed to move from a better solution to a worse one.
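A minimal sketch of that idea, using a toy cost function with one global and one local minimum and a simplified Gaussian-sampling loop as a stand-in for CMA-ES (the function `cost`, the loop, and all parameter values are made up for illustration, not CMA-ES itself):

```python
import random

def cost(x):
    # Toy benchmark: global minimum at x = 0 (value 0), local minimum at x = 3 (value 1).
    return min(x * x, (x - 3) ** 2 + 1)

def optimize(cycles=100, population=20, seed=0):
    rng = random.Random(seed)
    mean, sigma = 5.0, 2.0              # state of the search distribution
    best_x, best_f = None, float("inf") # best-ever solution, tracked separately
    for _ in range(cycles):
        samples = [rng.gauss(mean, sigma) for _ in range(population)]
        samples.sort(key=cost)
        # Record the best solution ever evaluated, regardless of where
        # the search distribution moves afterwards.
        if cost(samples[0]) < best_f:
            best_x, best_f = samples[0], cost(samples[0])
        # Move the distribution toward the better half of the population;
        # this step may well drift into a local basin.
        elite = samples[: population // 2]
        mean = sum(elite) / len(elite)
        sigma = max(0.9 * sigma, 1e-3)
    return best_x, best_f, mean

best_x, best_f, final_mean = optimize()
```

Even if `final_mean` settles in the local basin, `best_x`/`best_f` still holds the best point the run ever saw, which is what you would report.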

If you are running the algorithm with an approximate fitness function and a good-enough result is acceptable, you can go with what it converges to. Depending on the problem you are solving, overfitting a solution might be very good or very bad.

If your fitness function is not an approximation and is the correct metric to optimize, just keep the best performer and use it once the algorithm finishes.

Upvotes: 1
