Reputation: 1321
I want to use a genetic algorithm to solve a simple system of two linear equations with two variables. This is mainly to help me get a better understanding of how they work.
Everything seems pretty simple, but I am unsure how to encode possible solutions in the chromosomes for this problem.
I will have two variables which I want to encode in a chromosome to represent a solution. If each variable can be represented as an 8-bit number, would I make a 16-bit binary-encoded chromosome (a string of 1s and 0s)?
I am just not quite sure how that would work. If two parents are selected for breeding, how would randomly selecting genes from the binary string result in a possibly better solution? This is why I don't think a binary string would work, so any answers would be greatly appreciated!
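For reference, here is roughly what I mean by packing the two variables into one bit string (just a rough sketch; the function names are placeholders I picked):

```python
# Sketch: pack x and y (0-255 each) into one 16-bit chromosome
# represented as a list of bits, and recover them again.

def encode(x, y):
    """Concatenate the 8-bit patterns of x and y into a 16-bit chromosome."""
    bits = f"{x:08b}" + f"{y:08b}"
    return [int(b) for b in bits]

def decode(chromosome):
    """Split the 16-bit chromosome back into the two 8-bit variables."""
    x = int("".join(map(str, chromosome[:8])), 2)
    y = int("".join(map(str, chromosome[8:])), 2)
    return x, y

chrom = encode(37, 200)
print(chrom)          # 16 zeros and ones
print(decode(chrom))  # (37, 200)
```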
Upvotes: 2
Views: 1735
Reputation: 6475
Why not use the numbers as numbers? You don't have to use binary encoding in a GA; there are mutation and crossover operators that work well for real-valued encodings. Since you say it's a learning example, I would recommend you try both approaches; the real-valued encoding should converge much more quickly.
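To illustrate the real-valued approach, here is a minimal sketch. The example system (2x + y = 5, x - y = 1), the population size, and the truncation selection are arbitrary choices of mine just to show the idea:

```python
import random

# Minimal real-valued GA sketch for an example system such as
#   2x + y = 5
#   x -  y = 1    (true solution x = 2, y = 1)
# Each individual is simply an (x, y) pair of floats.

def fitness(ind):
    x, y = ind
    # Sum of squared residuals; 0 means the system is solved exactly.
    return (2*x + y - 5)**2 + (x - y - 1)**2

def crossover(a, b):
    # Arithmetic (blend) crossover: child is a random mix of the parents.
    w = random.random()
    return [w*a[i] + (1 - w)*b[i] for i in range(2)]

def mutate(ind, sigma=0.1):
    # Normally distributed mutation: add small Gaussian noise to each gene.
    return [g + random.gauss(0, sigma) for g in ind]

pop = [[random.uniform(-10, 10), random.uniform(-10, 10)] for _ in range(50)]
for gen in range(200):
    pop.sort(key=fitness)
    parents = pop[:10]                      # simple truncation selection
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(40)]

best = min(pop, key=fitness)
print(best, fitness(best))                  # should approach x=2, y=1
```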
For binary encoding I would use Single Point Crossover and Bit Flip Mutation. For real-valued encoding I would use Blend-Alpha-Beta Crossover (BLX-a-b) or Simulated Binary Crossover (SBX) and Normally Distributed Mutation. You can try these and many more operators on the SingleObjectiveTestFunctions in HeuristicLab.
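The two binary operators could look roughly like this, assuming the chromosome is a list of 0/1 values as in your question:

```python
import random

def single_point_crossover(parent1, parent2):
    # Cut both parents at the same random point and swap the tails.
    point = random.randint(1, len(parent1) - 1)
    return (parent1[:point] + parent2[point:],
            parent2[:point] + parent1[point:])

def bit_flip_mutation(chromosome, rate=1/16):
    # Flip each bit independently with a small probability.
    return [1 - bit if random.random() < rate else bit for bit in chromosome]

p1 = [random.randint(0, 1) for _ in range(16)]
p2 = [random.randint(0, 1) for _ in range(16)]
c1, c2 = single_point_crossover(p1, p2)
print(bit_flip_mutation(c1))
```

Because the cut point can fall inside one of the 8-bit variables, children can get values neither parent had, which is one reason binary encoding still "works" even though it feels unintuitive.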
Upvotes: 6