Reputation: 1739
This question relates to the Neuroph Java library.
I have the following program which creates a multi layer perceptron containing a single hidden layer of 20 nodes. The function being learnt is x^2. Backpropagation learning rule is used. However, as is evident from the output, the program doesn't seem to work. The output is always 1. Is there any error in my program?
Program
import org.neuroph.core.NeuralNetwork;
import org.neuroph.core.data.DataSet;
import org.neuroph.nnet.MultiLayerPerceptron;
import org.neuroph.nnet.learning.BackPropagation;
import org.neuroph.util.TransferFunctionType;
public class SquareNeuralNetwork {

    public static void main(String[] args) {
        NeuralNetwork neuralNetwork = new MultiLayerPerceptron(TransferFunctionType.SIGMOID, 1, 20, 1);

        DataSet trainingSet = new DataSet(1, 1);
        for (int i = 1; i <= 100; i++) {
            trainingSet.addRow(new double[]{i}, new double[]{i * i});
        }

        BackPropagation backPropagation = new BackPropagation();
        backPropagation.setMaxIterations(10);

        neuralNetwork.learn(trainingSet, backPropagation);

        for (int i = 1; i <= 100; i++) {
            neuralNetwork.setInput(i);
            neuralNetwork.calculate();
            double output = neuralNetwork.getOutput()[0];
            System.out.println(i + " - " + output);
        }
    }
}
Output
1 - 1.0
2 - 1.0
3 - 1.0
4 - 1.0
5 - 1.0
6 - 1.0
7 - 1.0
8 - 1.0
9 - 1.0
10 - 1.0
11 - 1.0
12 - 1.0
Upvotes: 3
Views: 2975
Reputation: 4285
The sigmoid activation function, sigmoid(z) = 1 / (1 + e^-z), only outputs values in the range (0, 1).
You are asking the network to output target values from 1 up to 10000 (the squares of 1..100), which a sigmoid output unit cannot produce. The best fit the network can achieve is therefore to always output 1.
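As a minimal illustration of that saturation (plain Java, not Neuroph-specific), evaluating the logistic sigmoid at a few points shows that it approaches but never exceeds 1, so targets like 4, 9 or 10000 are unreachable:

public class SigmoidRange {
    public static void main(String[] args) {
        // Logistic sigmoid: 1 / (1 + e^-z); the result always lies strictly between 0 and 1
        for (double z : new double[]{-10, 0, 5, 50}) {
            double s = 1.0 / (1.0 + Math.exp(-z));
            System.out.println(z + " -> " + s);
        }
    }
}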
You can still teach the neural network to model the squaring function if you remap the target to 1/x^2 rather than x^2, since this moves the outputs into the range (0, 1] for x >= 1. When using the network after training has completed, compute 1 / output to recover the x^2 curve you intended.
I modelled a network with 20 hidden nodes and one hidden layer as a proof of concept:
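Along those lines, here is a minimal sketch of what such a proof of concept could look like with Neuroph. The learning rate, iteration count, and the extra scaling of the input to [0, 1] are my own assumptions, not values taken from the original experiment:

import org.neuroph.core.NeuralNetwork;
import org.neuroph.core.data.DataSet;
import org.neuroph.nnet.MultiLayerPerceptron;
import org.neuroph.nnet.learning.BackPropagation;
import org.neuroph.util.TransferFunctionType;

public class InverseSquareNeuralNetwork {

    public static void main(String[] args) {
        // Same topology as in the question: 1 input, 20 hidden sigmoid nodes, 1 output
        NeuralNetwork neuralNetwork = new MultiLayerPerceptron(TransferFunctionType.SIGMOID, 1, 20, 1);

        // Train on the remapped target 1/x^2, which lies in (0, 1] for x >= 1.
        // The input is also scaled to [0, 1] so the hidden sigmoids do not saturate
        // (this input scaling is an assumption on top of the answer's suggestion).
        DataSet trainingSet = new DataSet(1, 1);
        for (int i = 1; i <= 100; i++) {
            trainingSet.addRow(new double[]{i / 100.0}, new double[]{1.0 / (i * i)});
        }

        // Hypothetical training settings; the original proof of concept does not state its values
        BackPropagation backPropagation = new BackPropagation();
        backPropagation.setLearningRate(0.1);
        backPropagation.setMaxIterations(10000);

        neuralNetwork.learn(trainingSet, backPropagation);

        // Invert the network's output to recover the intended curve: x^2 is roughly 1 / output
        for (int i = 1; i <= 100; i++) {
            neuralNetwork.setInput(i / 100.0);
            neuralNetwork.calculate();
            double output = neuralNetwork.getOutput()[0];
            System.out.println(i + " - " + (1.0 / output));
        }
    }
}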
Upvotes: 4