Reputation: 199
I use Encog for Java to do a time series prediction, but it does not seem to work at all. I am pretty new to this and I don't know what's wrong.
The red line is the training data (~3600 data entries) and the blue line is what the neural net predicts...
I use the last 250 data points to predict the next one.
Network Structure:
BasicNetwork net = new BasicNetwork();
net.addLayer(new BasicLayer(null, true, 250));                    // input layer: one neuron per lagged data point
net.addLayer(new BasicLayer(new ActivationSigmoid(), true, 6));   // hidden layer: 6 sigmoid neurons
net.addLayer(new BasicLayer(new ActivationSigmoid(), true, 1));   // output layer: the predicted next value
net.setLogic(new FeedforwardLogic());
net.getStructure().finalizeStructure();
net.reset();

// ndata holds the input windows and the expected next values
final ManhattanPropagation train = new ManhattanPropagation(net, ndata, 0.5);
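For context, here is a minimal sketch of how a sliding-window training set like ndata can be built (assuming Encog 3.x's BasicMLDataSet and a series already scaled to the 0..1 range of the sigmoid output; normalizedSeries stands in for the actual data):

// Sketch: build 250-value input windows and the following value as the ideal output.
double[] series = normalizedSeries;        // placeholder: ~3600 values scaled to 0..1
int window = 250;
int samples = series.length - window;
double[][] input = new double[samples][window];
double[][] ideal = new double[samples][1];
for (int i = 0; i < samples; i++) {
    for (int j = 0; j < window; j++) {
        input[i][j] = series[i + j];       // the 250 most recent values
    }
    ideal[i][0] = series[i + window];      // the next value to predict
}
MLDataSet ndata = new BasicMLDataSet(input, ideal);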
Also, it does not really matter how many iterations I do; after the first 10 or so iterations the error sticks at one constant value.
Upvotes: 0
Views: 515
Reputation: 3762
I don't think your approach will work (my humble view only) for time series prediction with a simple feed-forward network. The reason is that time series prediction needs to recognize a pattern that recurs in the data over time and then predict the next data point. My guess is that the network is not converging because of (probably) moving data. Use an LSTM or a Hierarchical Temporal Memory kind of network for this; you will probably get a better result.
Upvotes: 0
Reputation: 5151
First of all, your ratio of input/hidden/output nodes is very unlikely to work. In his books Heaton advises the following rule: suppose that x stands for the number of input nodes; then the number of hidden neurons should be about x * 2 / 3. For the output layer (Heaton doesn't give any advice except to experiment) I usually use x / 15.
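Applied to the question's 250-input window, that rule would suggest roughly 250 * 2 / 3 ≈ 167 hidden neurons rather than 6 (the single output neuron stays, since only the next value is predicted). A rough sketch of the resized network, reusing the question's own Encog calls:

BasicNetwork net = new BasicNetwork();
net.addLayer(new BasicLayer(null, true, 250));                    // 250 lagged inputs
net.addLayer(new BasicLayer(new ActivationSigmoid(), true, 167)); // ~ 250 * 2 / 3
net.addLayer(new BasicLayer(new ActivationSigmoid(), true, 1));   // next value
net.setLogic(new FeedforwardLogic());
net.getStructure().finalizeStructure();
net.reset();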
The second part is the training algorithm. Manhattan propagation is not as good as RMSProp + backpropagation (also according to Jeff Heaton's books).
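For illustration only, here is how the trainer line from the question could be swapped for Encog's stock ResilientPropagation (RPROP) trainer; this is a substitute sketch, not necessarily the RMSProp setup meant above, and it assumes Encog 3.x plus the net and ndata objects from the question:

// Sketch: replace ManhattanPropagation with Encog's RPROP trainer.
// net and ndata are the network and training set from the question.
final ResilientPropagation train = new ResilientPropagation(net, ndata);
int epoch = 1;
do {
    train.iteration();
    System.out.println("Epoch " + epoch + " error: " + train.getError());
    epoch++;
} while (train.getError() > 0.01 && epoch <= 1000);
train.finishTraining();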
Upvotes: 0