Grzenio

Reputation: 36679

Optimization of irregular functions

I have a complicated function of 4 double parameters, which has a lot of different local optima, and I have no reason to think that it is differentiable either. The only thing I can tell is the hypercube in which the interesting optima can be found.

I wrote a really crude and slow algorithm to optimize the function:

public static OptimalParameters bruteForce(Model function) throws FunctionEvaluationException, OptimizationException {
    System.out.println("BruteForce");
    double startingStep = 0.02;
    double minStep = 1e-6;
    int steps = 30;

    double[] start = function.startingGuess();
    int n = start.length;
    Comparer comparer = comparer(function);

    double[] minimum = start;
    double result = function.value(minimum);
    double step = startingStep;
    while (step > minStep) {
        System.out.println("STEP step=" + step);
        GridGenerator gridGenerator = new GridGenerator(steps, step, minimum);
        double[] point;
        while ((point = gridGenerator.NextPoint()) != null) {
            double value = function.value(point);
            if (comparer.better(value, result)) {
                System.out.println("New optimum " + value + " at " + function.timeSeries(point));
                result = value;
                minimum = point;
            }
        }
        step /= 1.93;
    }
    return new OptimalParameters(result, function.timeSeries(minimum));
}

private static Comparer comparer(Model model) {
    if (model.goalType() == GoalType.MINIMIZE) {
        return new Comparer() {
            @Override
            public boolean better(double newVal, double optimumSoFar) {
                return newVal < optimumSoFar;
            }
        };
    }
    return new Comparer() {
        @Override
        public boolean better(double newVal, double optimumSoFar) {
            return newVal > optimumSoFar;
        }
    };
}

private interface Comparer {
    boolean better(double newVal, double optimumSoFar);
}
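The GridGenerator class is not shown above; here is a hypothetical sketch of what it might look like, assuming it enumerates a grid of `steps` points per dimension, spaced `step` apart and centred on the current best point (the name, constructor, and `NextPoint` method are taken from the calling code; everything else is an assumption):

```java
// Hypothetical sketch of the GridGenerator used above. Assumed behaviour:
// enumerate a hypercube of `steps` points per dimension, spaced `step`
// apart and centred on the current best point.
public class GridGenerator {
    private final int steps;       // points per dimension
    private final double step;     // spacing between grid points
    private final double[] center; // current best point
    private final int[] index;     // multi-dimensional counter
    private boolean done = false;

    public GridGenerator(int steps, double step, double[] center) {
        this.steps = steps;
        this.step = step;
        this.center = center.clone();
        this.index = new int[center.length];
    }

    /** Returns the next grid point, or null when the grid is exhausted. */
    public double[] NextPoint() {
        if (done) {
            return null;
        }
        double[] point = new double[center.length];
        for (int i = 0; i < center.length; i++) {
            // Offsets run from -steps/2 to +steps/2 around the centre.
            point[i] = center[i] + (index[i] - steps / 2) * step;
        }
        // Advance the counter like an odometer.
        int d = 0;
        while (d < index.length && ++index[d] == steps) {
            index[d] = 0;
            d++;
        }
        if (d == index.length) {
            done = true;
        }
        return point;
    }
}
```

With this layout each refinement pass evaluates steps^n points (30^4 = 810,000 for the four-parameter case), which explains why the search is slow.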

Note that finding a better local optimum matters more to me than the speed of the algorithm.

Are there any better algorithms for this kind of optimization? Do you have any ideas for improving this design?

Upvotes: 2

Views: 488

Answers (3)

Andreas

Reputation: 6475

Your problem sounds like an ideal fit for metaheuristics. You could try a metaheuristic such as Evolution Strategies (ES), which was designed for tough multimodal real-vector functions.

ES and several benchmark functions (Rosenbrock, Rastrigin, Ackley, etc.) are implemented in our software HeuristicLab. You can implement your own function there and have it optimized; you don't need to write much code, and the existing functions serve as examples to copy from. You would need to port your code to C#, but only the evaluation; the other parts are not needed.

An advantage of having your function in HeuristicLab is that you can also try to optimize it with Particle Swarm Optimization (PSO), a Genetic Algorithm, or Simulated Annealing, which are already implemented as well, and see which one works best. You only need to implement the evaluation function once.

Or you can scan the literature for papers on Evolution Strategies and reimplement one yourself. IIRC Beyer has implementations on his website; they are written for MATLAB.
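To give a feel for how little code a basic ES needs, here is an illustrative Java sketch of a (1+1)-ES with the classic 1/5th success rule, minimizing the Rastrigin benchmark as a stand-in objective. This is not HeuristicLab code, and all parameter values are conventional defaults:

```java
import java.util.Random;

// Illustrative (1+1)-Evolution Strategy with the 1/5th success rule,
// minimizing a stand-in multimodal function (Rastrigin).
public class OnePlusOneES {
    static double rastrigin(double[] x) {
        double sum = 10.0 * x.length;
        for (double xi : x) {
            sum += xi * xi - 10.0 * Math.cos(2.0 * Math.PI * xi);
        }
        return sum;
    }

    static double[] minimize(double[] start, int iterations, long seed) {
        Random rng = new Random(seed);
        double[] parent = start.clone();
        double parentFitness = rastrigin(parent);
        double sigma = 0.5; // mutation step size, adapted online
        int successes = 0;
        for (int t = 1; t <= iterations; t++) {
            // Mutate: add Gaussian noise to every coordinate.
            double[] child = new double[parent.length];
            for (int i = 0; i < child.length; i++) {
                child[i] = parent[i] + sigma * rng.nextGaussian();
            }
            double childFitness = rastrigin(child);
            if (childFitness <= parentFitness) {
                parent = child;
                parentFitness = childFitness;
                successes++;
            }
            // 1/5th rule: every 20 iterations, grow sigma if more than a
            // fifth of the mutations succeeded, shrink it otherwise.
            if (t % 20 == 0) {
                sigma *= (successes > 4) ? 1.22 : 0.82;
                successes = 0;
            }
        }
        return parent;
    }
}
```

The step-size adaptation is what lets the search first move between basins and later settle into one, which matches the coarse-to-fine idea of the grid search in the question.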

Upvotes: 0

Andrew

Reputation: 551

Try something classic: http://en.wikipedia.org/wiki/Golden_section_search
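For reference, golden-section search fits in a few lines of Java. Note that it is a one-dimensional method and assumes a unimodal function, so for the four-parameter problem it would have to be applied along one coordinate at a time; this is a minimal sketch:

```java
// Minimal golden-section search for a unimodal function on [a, b].
public class GoldenSection {
    static final double PHI = (Math.sqrt(5.0) - 1.0) / 2.0; // ~0.618

    interface Fn { double apply(double x); }

    static double minimize(Fn f, double a, double b, double tol) {
        // Interior probe points divide [a, b] in the golden ratio.
        double c = b - PHI * (b - a);
        double d = a + PHI * (b - a);
        while (b - a > tol) {
            if (f.apply(c) < f.apply(d)) {
                b = d; // minimum lies in [a, d]
            } else {
                a = c; // minimum lies in [c, b]
            }
            c = b - PHI * (b - a);
            d = a + PHI * (b - a);
        }
        return (a + b) / 2.0;
    }
}
```

Each iteration shrinks the bracket by a factor of about 0.618, so the interval width decreases geometrically.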

Upvotes: 0

Andrey Rubshtein

Reputation: 20915

You can use simplex-based optimization. It is well suited to exactly this kind of problem.

If you can use MATLAB, at least for prototyping, try using fminsearch:
http://www.mathworks.com/help/techdoc/ref/fminsearch.html

[1] Lagarias, J. C., J. A. Reeds, M. H. Wright, and P. E. Wright, "Convergence Properties of the Nelder-Mead Simplex Method in Low Dimensions," SIAM Journal on Optimization, Vol. 9, No. 1, pp. 112-147, 1998.
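For a Java prototype without MATLAB, the same Nelder-Mead simplex method can be sketched in a few dozen lines. This is an illustrative, simplified implementation (it always uses an inside contraction and runs for a fixed iteration count instead of testing convergence), not fminsearch itself:

```java
import java.util.Arrays;
import java.util.Comparator;

// Simplified Nelder-Mead simplex minimizer (the algorithm behind
// fminsearch), using the standard reflection/expansion/contraction/shrink
// coefficients. Sketch only: fixed iteration budget, inside contraction.
public class NelderMead {
    interface Fn { double apply(double[] x); }

    static double[] minimize(Fn f, double[] start, double initStep, int iterations) {
        int n = start.length;
        double[][] simplex = new double[n + 1][];
        simplex[0] = start.clone();
        for (int i = 0; i < n; i++) {
            simplex[i + 1] = start.clone();
            simplex[i + 1][i] += initStep; // perturb one coordinate per vertex
        }
        for (int iter = 0; iter < iterations; iter++) {
            // Sort vertices from best (lowest f) to worst.
            Arrays.sort(simplex, Comparator.comparingDouble(f::apply));
            double[] worst = simplex[n];
            // Centroid of all vertices except the worst.
            double[] c = new double[n];
            for (int i = 0; i < n; i++) {
                for (int j = 0; j < n; j++) c[j] += simplex[i][j] / n;
            }
            double[] xr = combine(c, worst, -1.0);     // reflection
            double fr = f.apply(xr);
            if (fr < f.apply(simplex[0])) {
                double[] xe = combine(c, worst, -2.0); // expansion
                simplex[n] = f.apply(xe) < fr ? xe : xr;
            } else if (fr < f.apply(simplex[n - 1])) {
                simplex[n] = xr;                       // accept reflection
            } else {
                double[] xc = combine(c, worst, 0.5);  // contraction
                if (f.apply(xc) < f.apply(worst)) {
                    simplex[n] = xc;
                } else {
                    // Shrink the whole simplex toward the best vertex.
                    for (int i = 1; i <= n; i++) {
                        simplex[i] = combine(simplex[0], simplex[i], 0.5);
                    }
                }
            }
        }
        Arrays.sort(simplex, Comparator.comparingDouble(f::apply));
        return simplex[0];
    }

    // Returns c + t * (x - c).
    private static double[] combine(double[] c, double[] x, double t) {
        double[] out = new double[c.length];
        for (int i = 0; i < c.length; i++) out[i] = c[i] + t * (x[i] - c[i]);
        return out;
    }
}
```

Since Nelder-Mead is a local method, for a function with many local optima it would typically be restarted from several points inside the hypercube, keeping the best result.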

Upvotes: 2
