GJHix

Reputation: 522

Why does the neural network stop training because of the gradient?

I am training a neural network and it stopped training due to the gradient stopping condition. From what I can see, the gradient 8.14e-06 is larger than the minimum gradient 1e-05, so why did it stop? Is it because the gradient wasn't improving, so there was little point in continuing?

I am very new to neural networks (and using MATLAB's nntool) so any help/explanation would be much appreciated.

[Screenshot: Neural Network Training Performance window]
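For context, this is roughly how the network is set up and trained (a sketch only; nntool builds an equivalent network object behind the scenes, and the function and variable names below are illustrative, not my exact setup):

    % Sketch: feedforwardnet, inputs and targets are placeholders.
    net = feedforwardnet(10);             % network with 10 hidden neurons
    disp(net.trainParam.min_grad)         % minimum-gradient stopping criterion, e.g. 1e-05
    [net, tr] = train(net, inputs, targets);
    disp(tr.stop)                         % reason training stopped, e.g. a minimum-gradient message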

Upvotes: 3

Views: 2579

Answers (1)

alfa

Reputation: 3088

This is not a neural network problem; it is a problem of reading floating-point (scientific) notation:

8.14e-06 = 8.14 × 10^-6 = 0.00000814 < 0.00001 = 1.0 × 10^-5 = 1e-05
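The gradient fell below the threshold, so training stopped as intended. You can check the comparison directly at the MATLAB prompt:

    8.14e-06 < 1e-05      % returns 1 (true): the gradient is below the minimum

If you want training to continue past this point, the threshold can be lowered before calling train, e.g. net.trainParam.min_grad = 1e-7 (assuming the usual network object from the toolbox), although a gradient this small usually means further epochs would change the weights very little.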

Upvotes: 5
