NefariousOctopus

Reputation: 877

Genetic algorithm - iterative optimization

Hi, I'm an absolute beginner and I have a rather theoretical question about iterative optimization in genetic algorithms.

Where in the genetic cycle (see below) does the iterative optimization logically belong? I'm not really sure, but I think it could be in population initialization or mutation, depending on the given problem.

To be more specific about the iterative optimization algorithm: I would like to use "hill climbing" or "simulated annealing".

I use this model as reference:

[Figure: diagram of the genetic algorithm cycle]

Upvotes: 2

Views: 463

Answers (1)

zegkljan

Reputation: 8419

Well, there are several possibilities, and basically all of them make sense.

If you put the optimisation phase after population initialization (running it once, before the genetic algorithm itself), what you get is an already-optimized initial population. This might be useful because the genetic algorithm does not have to search as much, but it can also be harmful: you may optimize each individual toward some local optimum, possibly losing useful diversity in the process.
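As a minimal sketch of that first variant: below, each randomly initialized individual is improved by hill climbing before the GA starts. The OneMax problem (maximise the number of 1-bits), the `hill_climb` function, and all names are my own illustrative assumptions, not part of any particular GA library.

```python
import random

# Toy problem (assumed for illustration): OneMax - maximise the number of 1s.
def fitness(ind):
    return sum(ind)

def hill_climb(ind, steps=50):
    """First-improvement hill climbing on a bit string:
    flip one random bit and keep the change only if fitness improves."""
    best = list(ind)
    for _ in range(steps):
        i = random.randrange(len(best))
        neighbour = list(best)
        neighbour[i] ^= 1  # flip one bit
        if fitness(neighbour) > fitness(best):
            best = neighbour
    return best

def init_population(pop_size, genome_len):
    """Random initialization followed by local optimization of
    each individual - the 'optimize the initial population' variant."""
    population = [[random.randint(0, 1) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    return [hill_climb(ind) for ind in population]
```

Note that because only improving moves are accepted, every individual handed to the GA is at least as fit as its random ancestor, which is exactly where the risk of premature convergence comes from.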

If you put the optimisation phase before selection, you get a so-called memetic algorithm. An MA is based on the idea that an organism learns throughout its lifetime (the optimization). You have two possibilities for how to do that:

  1. Take an individual, optimize it, and replace the original individual with its optimized version. This is called "Lamarckian evolution" and is based on the idea (originally by Jean-Baptiste Lamarck at the beginning of the 19th century) that learned features can be passed on to offspring.

  2. Take an individual and optimize it, but then discard the optimized individual and assign its fitness to the original, unoptimized one. In this variant the optimization effectively becomes part of the fitness evaluation process. This is called the "Baldwin effect" and is based on the idea (originally by James Mark Baldwin at the end of the 19th century) that learned features cannot be passed on to offspring; the genetic information rather describes the ability to learn. By the way, this is how natural evolution actually works.
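The two variants above differ only in what survives the local search. Here is a hedged Python sketch contrasting them, again assuming a toy OneMax problem and a simple `hill_climb` helper of my own invention:

```python
import random

# Assumed toy problem: OneMax - individuals are bit strings, fitness counts 1s.
def fitness(ind):
    return sum(ind)

def hill_climb(ind, steps=30):
    """First-improvement hill climbing: keep only fitness-improving bit flips."""
    best = list(ind)
    for _ in range(steps):
        i = random.randrange(len(best))
        neighbour = list(best)
        neighbour[i] ^= 1
        if fitness(neighbour) > fitness(best):
            best = neighbour
    return best

def evaluate_lamarckian(ind):
    """Lamarckian: optimize, then REPLACE the genotype with the
    optimized version, so the learned improvement is inherited."""
    improved = hill_climb(ind)
    return improved, fitness(improved)

def evaluate_baldwinian(ind):
    """Baldwinian: optimize, KEEP the original genotype, but credit it
    with the optimized fitness - learning affects selection only."""
    improved = hill_climb(ind)
    return list(ind), fitness(improved)
```

Either function would slot in just before selection: the returned genotype goes back into the population, and the returned fitness is what selection sees.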

The optimization can, of course, be placed at any of the other points, but that is not commonly done (at least I don't know of any such case). However, your problem and your mutation/crossover operators might be such that optimization at those points is somehow beneficial. Or you can just try it and see what you get.

Upvotes: 2

Related Questions