Reputation: 141
I am trying to write code for a GA to minimize the cost of a system. The problem is that the solution converges towards a local minimum and gets stuck there, so I can't improve my solution any more.
It might be my selection method that is causing the problem. Here is what I have:
%---------------------------- selection (fittest half) ----------------
[~,IX] = sort(cost(1:population));        % rank individuals by cost, ascending
dd = sum(1:population/2);                 % normalising constant for the rank weights
probability = fliplr(1:population/2)/dd;  % best rank gets the largest weight; weights sum to 1
Indexx = IX(1:population/2);              % indices of the fittest half
Then I use Indexx for the crossover, etc. Can anybody suggest a solution?
Upvotes: 7
Views: 14627
Reputation: 682
My answer is based on a particular case from my work experience with a genetic algorithm. I used pymoo, a very nice Python framework for solving multi-objective problems. I had a problem similar to yours, maybe not the same, but I want to write it down here in case someone is going through the same thing, especially with pymoo.
I had to solve a quadratic problem with many constraints. It had two objectives and two types of variables, binary and real, and the real variables depended on the binary ones in some constraints. I had a custom sampling algorithm that generated a very varied initial population meeting all of the problem's restrictions. I used pymoo's built-in operators for mutation, crossover, and survival, and the problem was exactly with the survival. Most survival methods keep the members of the population that meet the constraints (feasible solutions), but the mutation and crossover operators did not generate feasible members, so the initial population generated by my sampling could never be overcome in the survival step. Even setting the mutation rate as high as possible never left the initial population (I was using uniform crossover and binary bit-flip mutation).

So I had to write my own mutation and crossover algorithms, and I also introduced a repair operator to handle the constraints in each iteration! Perhaps the problem is not the parameter settings, but the choice of operators and the relationships between them.
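To give an idea of what a repair operator does, here is a minimal, framework-agnostic sketch (it does not use pymoo's actual API, and the constraint is hypothetical): each gene is a (binary switch, real value) pair, and the real value must be 0 when its switch is off and inside given bounds when it is on.

```python
def repair(individual, lo=0.0, hi=10.0):
    """Force each (z, x) pair back into feasibility: the real value x
    must be 0 when its binary switch z is off, and inside [lo, hi]
    when z is on. (Hypothetical constraint, for illustration only.)"""
    repaired = []
    for z, x in individual:
        if z == 0:
            repaired.append((0, 0.0))                   # switched off -> x forced to 0
        else:
            repaired.append((1, min(max(x, lo), hi)))   # clip x into its bounds
    return repaired
```

Running such a repair after every mutation/crossover step guarantees that the survival stage always sees feasible candidates, so the offspring can actually compete with the initial population.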
Upvotes: 0
Reputation: 91
Maybe my answer arrives too late for your needs. Anyhow: starting from the point that GAs are not meant to find the optimal solution in a search space, if your system gets stuck in a local minimum, and assuming the cone of that minimum is steep enough, the way to get out of it is to 'shake' your population (i.e. introduce noise).

An effective technique is to reduce the crossover probability and increase the mutation rate as the number of steady generations grows. Together with this, or even before starting to cut the crossover probability, you can start introducing fresh, random individuals into your population, increasing their number as the count of steady generations increases. Go back to the initial configuration once the fitness starts to improve again.

By the way, a good way to train while avoiding local minima is to randomize the allele selected for crossover and to work with large populations, selecting only the best 1-2% for the next generation, but of course this highly depends on your search space.
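The adaptive schedule described above can be sketched like this (the thresholds and increments are illustrative assumptions, not tuned values):

```python
def adapt_parameters(steady_gens, base_mut=0.01, base_cx=0.9, pop_size=100):
    """The longer the best fitness has been steady, the harder we 'shake':
    raise the mutation rate, lower the crossover probability, and inject
    more random immigrants. All constants here are placeholders."""
    mut_rate = min(0.5, base_mut * (1 + steady_gens))    # mutation grows with stagnation
    cx_rate = max(0.3, base_cx - 0.01 * steady_gens)     # crossover shrinks with stagnation
    n_immigrants = min(pop_size // 4, steady_gens)       # fresh random individuals per generation
    return mut_rate, cx_rate, n_immigrants
```

Once the best fitness improves again, you reset `steady_gens` to 0, which restores the initial configuration automatically.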
Upvotes: 4
Reputation: 8391
Well, not only the selection/mutation/crossover operators influence the ability to not get stuck in local optima, but also the representation of the solution and the fitness landscape. You can do something about the operators and the representation, but overcoming the fitness landscape is tricky. There are concepts even for this, though.
Take a look at some diversity-preserving mechanisms (e.g. fitness sharing), and I strongly suggest looking at Novelty Search. It's a new (well, probably not anymore, but I learned it as new :)) concept where you don't use the fitness for selection at all. There are also combinations of NS and classical fitness-driven search; look at Mouret's paper on that page's Publications, or look at my master's thesis, which is all about combining fitness and novelty.
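For the fitness-sharing idea, here is a minimal sketch of the classic scheme: each individual's fitness is divided by its niche count, so crowded regions of the search space are penalised. It assumes a maximisation fitness and a one-dimensional position for simplicity.

```python
def shared_fitness(fitnesses, positions, sigma=1.0, alpha=1.0):
    """Fitness sharing: divide each fitness by the individual's niche
    count, computed from a triangular sharing function over pairwise
    distances. sigma is the niche radius; alpha shapes the falloff."""
    shared = []
    for f_i, x_i in zip(fitnesses, positions):
        niche_count = 0.0
        for x_j in positions:
            d = abs(x_i - x_j)
            if d < sigma:
                niche_count += 1.0 - (d / sigma) ** alpha
        shared.append(f_i / niche_count)  # niche_count >= 1 (self-distance is 0)
    return shared
```

Two individuals sitting on the same peak end up sharing its fitness, so selection pressure pushes part of the population toward unexplored niches instead of piling everyone into one local optimum.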
Upvotes: 7
Reputation: 1779
In general, optimization algorithms converge to a local minimum. To get out of this local minimum in a genetic algorithm, you can use mutations. Mutations are applied to some individuals of a generation. Usually a mutation will be bad, make the result worse, and not be selected for the next generation, but sometimes a mutation moves an individual closer to a different (and sometimes better) local minimum. The higher the mutation rate, the more of the space is searched and the higher the chance that the global minimum is found. There is a catch, though: if the mutation rate is too high, the algorithm won't converge anymore.
I hope this was helpful for your problem.
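As a concrete illustration of the mutation-rate tradeoff, here is a minimal bit-flip mutation sketch for a binary-encoded individual (the encoding is an assumption; the question's individuals may be real-valued):

```python
import random

def mutate(individual, rate):
    """Bit-flip mutation: each gene flips independently with probability
    `rate`. A low rate preserves convergence; a high rate explores more
    of the space but, taken too far, turns the GA into random search."""
    return [1 - g if random.random() < rate else g for g in individual]
```

With `rate=0.0` the individual passes through unchanged; with `rate=1.0` every bit flips; a typical working value sits somewhere around 1/len(individual).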
Upvotes: 11