Reputation: 61
I am using a GA to evaluate a continuous function for a vector with approximately 40,000 variables. Currently I am using a population size of 200 where every member of the population has 40,000 variables. I am using 50 iterations.
With these numbers, the GA does not get close to the optimum solution. I was wondering if there is a way to determine the best population size and number of iterations for a vector of this size (40,000 variables).
Upvotes: 6
Views: 7332
Reputation:
I have answered a similar question here. Basically, you have a very large number of variables and an extremely small number of generations. I would look into parallelising your algorithm, increasing your population size, and increasing the number of iterations.
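To make that concrete, here is a minimal sketch of a GA with a parallel fitness-evaluation step. Everything in it is an assumption: I don't know your objective, so I use the sphere function (`-sum(x**2)`, optimum at the zero vector) as a hypothetical stand-in, and I use a thread pool for portability — for a real CPU-bound Python objective you would swap in `multiprocessing.Pool` with the same `map` call.

```python
import numpy as np
from multiprocessing.pool import ThreadPool  # swap for multiprocessing.Pool if fitness is CPU-bound

def fitness(x):
    # Hypothetical objective: sphere function, maximised (optimum 0 at x = 0).
    return -np.sum(x ** 2)

def evaluate_parallel(population, workers=4):
    # Evaluate every member's fitness concurrently; pool.map preserves order.
    with ThreadPool(workers) as pool:
        return np.array(pool.map(fitness, population))

def run_ga(n_vars, pop_size, generations, mutation_scale=0.05, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.standard_normal((pop_size, n_vars))
    for _ in range(generations):
        fit = evaluate_parallel(pop)
        # Truncation selection: keep the best half as parents.
        parents = pop[np.argsort(fit)[-max(2, pop_size // 2):]]
        # Uniform crossover between randomly paired parents.
        pa = parents[rng.integers(len(parents), size=pop_size)]
        pb = parents[rng.integers(len(parents), size=pop_size)]
        mask = rng.random((pop_size, n_vars)) < 0.5
        # Gaussian mutation on every child.
        pop = np.where(mask, pa, pb) + mutation_scale * rng.standard_normal((pop_size, n_vars))
    fit = evaluate_parallel(pop)
    return pop[np.argmax(fit)], float(fit.max())

# Small demo sizes; for the real problem you would use n_vars=40_000 and a
# much larger population and generation count than in the question.
best, score = run_ga(n_vars=20, pop_size=40, generations=80, seed=0)
print(f"best fitness found: {score:.4f}")
```

The point is that fitness evaluation is the embarrassingly parallel part, so scaling up the population and generation count mostly costs wall-clock time proportional to cores available.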
Also @Peter Lawrey makes good suggestions.
Upvotes: 2
Reputation: 533880
Yes, it's called trial and error. I suggest starting with a much larger population size and seeing how close you get, then repeatedly decreasing the population size until you find the point where it gives unacceptable results.
You should also check that the population size really is the problem. There might be a bug in your algorithm such that, for any population size and number of iterations, you still won't get an ideal solution.
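A sketch of that trial-and-error sweep, under assumptions: a toy self-contained GA (truncation selection, uniform crossover, Gaussian mutation) on the sphere function stands in for the real objective, and the acceptance threshold `target` is an arbitrary placeholder you would set from your own quality requirements.

```python
import numpy as np

def best_fitness(pop_size, n_vars=20, generations=80, seed=0):
    """Run a tiny GA on a stand-in objective (sphere function, maximised,
    optimum 0) and return the best fitness reached."""
    rng = np.random.default_rng(seed)
    pop = rng.standard_normal((pop_size, n_vars))
    for _ in range(generations):
        fit = -np.sum(pop ** 2, axis=1)
        # Keep the best half as parents, then uniform crossover + mutation.
        parents = pop[np.argsort(fit)[-max(2, pop_size // 2):]]
        pa = parents[rng.integers(len(parents), size=pop_size)]
        pb = parents[rng.integers(len(parents), size=pop_size)]
        mask = rng.random((pop_size, n_vars)) < 0.5
        pop = np.where(mask, pa, pb) + 0.05 * rng.standard_normal((pop_size, n_vars))
    return float(np.max(-np.sum(pop ** 2, axis=1)))

# Start large and shrink: record the result at each population size,
# then take the smallest size that still meets the quality bar.
target = -5.0  # placeholder acceptance threshold; problem-specific
results = {p: best_fitness(p) for p in (200, 100, 50, 20, 8)}
for p, s in results.items():
    print(f"pop={p:4d}  best fitness={s:.4f}")
acceptable = [p for p, s in results.items() if s >= target]
smallest_ok = min(acceptable) if acceptable else None
print(f"smallest acceptable population size: {smallest_ok}")
```

Running the same sweep with a deliberately broken operator (e.g. mutation only, no selection) is also a quick way to test the second point: if no size in the sweep ever reaches the target, suspect the algorithm rather than the parameters.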
Upvotes: 3