Michael

Reputation: 2773

Neural Network Error Trend

I'm relatively new to neural networks, so I was quite interested when I came across this trend in my data. I have a multilayer perceptron network using back-propagation with no momentum. The learning rate is 0.02, and the error threshold for stopping training is 0.01. The network has to learn to correctly apply an XOR operation to two boolean values (each either 1 or 0). There are bias neurons to account for the fact that the XOR problem is not linearly separable (not sure if I phrased that correctly). The net's neurons use the sigmoid transfer function. I was wondering why the error graph always follows this trend: after re-randomizing the network's weights and training again, the graph always comes out the same. Why is that?

[Screenshot of the error graph]
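For context, here is a minimal sketch of the setup described above: a 2-2-1 perceptron trained on XOR with plain back-propagation, sigmoid units, bias weights, and no momentum, using the question's learning rate (0.02) and stopping threshold (0.01). This is not the asker's actual code; the network shape and all names (W1, W2, errors, and so on) are illustrative assumptions.

```python
# Minimal sketch (not the asker's code): 2-2-1 MLP on XOR, plain backprop,
# sigmoid activations, bias weights, no momentum.
import numpy as np

rng = np.random.default_rng()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR training set: two boolean inputs, one boolean target.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Randomly initialized weights; the extra row in each matrix is the bias.
W1 = rng.uniform(-1, 1, size=(3, 2))  # input (2 + bias) -> hidden (2)
W2 = rng.uniform(-1, 1, size=(3, 1))  # hidden (2 + bias) -> output (1)

lr = 0.02            # learning rate from the question
target_error = 0.01  # stop once mean squared error falls below this

errors = []
for epoch in range(200_000):
    # Forward pass (append a constant 1 for the bias neuron at each layer).
    Xb = np.hstack([X, np.ones((4, 1))])
    H = sigmoid(Xb @ W1)
    Hb = np.hstack([H, np.ones((4, 1))])
    Y = sigmoid(Hb @ W2)

    # Mean squared error over the four patterns.
    err = np.mean((T - Y) ** 2)
    errors.append(err)
    if err < target_error:
        break

    # Backward pass: the sigmoid derivative is y * (1 - y).
    # W2[:2] excludes the bias row, which has no upstream connection.
    delta_out = (Y - T) * Y * (1 - Y)
    delta_hid = (delta_out @ W2[:2].T) * H * (1 - H)

    # Plain gradient descent, no momentum term.
    W2 -= lr * Hb.T @ delta_out
    W1 -= lr * Xb.T @ delta_hid

print(f"stopped after {len(errors)} epochs, error = {errors[-1]:.4f}")
```

Each run should trace the same qualitative curve: the error starts near its initial random value and falls steadily toward the 0.01 threshold.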

Upvotes: 0

Views: 244

Answers (1)

nitbix

Reputation: 153

To me that looks correct, and perhaps you're just misreading the graph. The x-axis shows the training iterations; you can think of it as the time axis, for simplicity. The y-axis shows the network's error (the lower, the better). So as training progresses, your network is producing better results, with lower error.
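If it helps to double-check that reading, here is a hypothetical snippet (assuming you have recorded the per-iteration training error in a list, as the sketch under the question does) that plots error against iteration with matplotlib; a curve falling toward zero means training is converging.

```python
import matplotlib.pyplot as plt

def plot_error_curve(errors):
    # Plot training error against iteration index (lower is better).
    plt.plot(range(len(errors)), errors)
    plt.xlabel("iteration")             # the "time" axis of the graph
    plt.ylabel("training error (MSE)")  # falls as the network improves
    plt.title("Error trend during training")
    plt.show()
```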

Upvotes: 2
