vincenzodentamaro

Reputation: 392

HyperNEAT network for Time Series forecasting with Encog help needed

I am using Encog AI Framework for Time Series forecasting using HyperNEAT network.

Here is the simple code I use to create the network.

                Substrate substrate = SubstrateFactory.factorSandwichSubstrate(columns * windowSize, days);
                CalculateScore score = new TrainingSetScore(trainingSet);
                NEATPopulation pop = new NEATPopulation(substrate, 500);
                pop.setActivationCycles(4);
                pop.reset();
                EvolutionaryAlgorithm train = NEATUtil.constructNEATTrainer(pop, score);
                OriginalNEATSpeciation speciation = new OriginalNEATSpeciation();
                speciation.setCompatibilityThreshold(1);
                // reuse the configured speciation object; creating a new one here
                // would discard the compatibility threshold set above
                train.setSpeciation(speciation);

                System.out.println("Is HyperNEAT "+pop.isHyperNEAT());
                // train the neural network

                int epoch = 1;

                do {
                    train.iteration();
                    if(writeOnStdOut)
                        System.out.println("Epoch #" + epoch + " Error:" + train.getError());
                    epoch++;
                    if (Math.abs(train.getError() - previousError) < 1e-9)
                        iterationWithoutImprovement++;
                    else
                        iterationWithoutImprovement = 0;
                    previousError = train.getError();
                    Date dtemp = new Date();
                } while(train.getError() > maximumAcceptedErrorTreshold && epoch < maxIterations && iterationWithoutImprovement < maxiter);

                NEATGenome genome = (NEATGenome) pop.getBestGenome();
                HyperNEATCODEC codec = new HyperNEATCODEC();
                network2 = (NEATNetwork) codec.decode(pop, substrate, genome);

It was taken from the Boxes example: https://github.com/encog/encog-java-examples/tree/master/src/main/java/org/encog/examples/neural/neat/boxes

where columns is the number of features and windowSize is the number of previous days used to forecast the future value (in my example windowSize is 1).
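To make the dimensions concrete, here is a minimal sketch (plain Java, no Encog dependency; the helper name buildInputRow is hypothetical) of how a sliding-window input row must line up with a sandwich substrate that has columns * windowSize input neurons:

```java
public class WindowCheck {
    // Hypothetical helper: flatten windowSize consecutive rows of
    // `columns` features into one input vector, as a substrate built
    // with columns * windowSize input neurons would expect.
    static double[] buildInputRow(double[][] series, int start,
                                  int windowSize, int columns) {
        double[] row = new double[windowSize * columns];
        for (int w = 0; w < windowSize; w++) {
            System.arraycopy(series[start + w], 0, row, w * columns, columns);
        }
        return row;
    }

    public static void main(String[] args) {
        int columns = 3, windowSize = 1;
        double[][] series = { {1.0, 2.0, 3.0}, {4.0, 5.0, 6.0} };
        double[] row = buildInputRow(series, 0, windowSize, columns);
        // Every input row fed to the network must have exactly
        // columns * windowSize elements; a mismatch is one way to get
        // an ArrayIndexOutOfBoundsException inside NEATNetwork.compute.
        assert row.length == columns * windowSize;
        System.out.println(row.length);
    }
}
```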

I get this exception:

    Exception in thread "pool-2-thread-416" java.lang.ArrayIndexOutOfBoundsException
        at org.encog.util.EngineArray.arrayCopy(EngineArray.java:107)
        at org.encog.neural.neat.NEATNetwork.compute(NEATNetwork.java:194)
        at org.encog.util.error.CalculateRegressionError.calculateError(CalculateRegressionError.java:46)
        at org.encog.neural.networks.training.TrainingSetScore.calculateScore(TrainingSetScore.java:61)
        at org.encog.ml.ea.score.parallel.ParallelScoreTask.run(ParallelScoreTask.java:83)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.lang.Thread.run(Unknown Source)

It seems there is a problem with thread handling. Can someone help me solve this? My second question: how can I train a NEAT network with backpropagation in Encog?

Upvotes: 0

Views: 1342

Answers (2)

net_programmer

Reputation: 372

I have declared this

Substrate substrate = SubstrateFactory.factorSandwichSubstrate((int)Math.sqrt(NDataSetFeatures),1);

where the last parameter is the number of classes, and this works for me.

Upvotes: 0

JeffHeaton

Reputation: 3288

As to the array-out-of-bounds exception: I looked at that line, and the only thing that can cause it is that you are sending in an input vector with more elements than you have input neurons. I would make sure you are defining the neural network with the same input dimensions your data ultimately ends up having.
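One cheap way to surface that mismatch early is a guard before scoring (a plain-Java sketch; `expectedInputCount` is hypothetical and would be whatever the substrate was built with, e.g. columns * windowSize):

```java
public class InputGuard {
    // Fail fast with a clear message instead of letting the array copy
    // inside the framework hit an ArrayIndexOutOfBoundsException deep
    // in a worker thread, where the cause is hard to see.
    static void checkInput(double[] input, int expectedInputCount) {
        if (input.length != expectedInputCount) {
            throw new IllegalArgumentException(
                "Input vector has " + input.length
                + " elements but the network expects " + expectedInputCount);
        }
    }

    public static void main(String[] args) {
        checkInput(new double[]{0.1, 0.2}, 2); // matches: no exception
        try {
            checkInput(new double[]{0.1, 0.2, 0.3}, 2); // mismatch
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```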

As to backpropagation and NEAT/HyperNEAT: that is not how these networks are designed to be trained; at least, that is not how the Kenneth Stanley implementations work. It is all genetic training. There might be a way to fine-tune a NEAT network with backprop, but I have not attempted it.

Upvotes: 1
