Reputation: 15301
I have an AI project that uses a backpropagation neural network.
It has been training for about 1 hour, and it has learned 60-70 of the 100 training inputs. By that I mean 60-70 inputs currently satisfy the backpropagation error condition (the number of learned inputs fluctuates between 60 and 70).
So far more than 10,000 epochs have been completed, and each epoch takes almost 0.5 seconds.
How can I tell whether the neural network will eventually train successfully if I leave it running for a long time, or whether it cannot train any better than this?
Upvotes: 3
Views: 2885
Reputation: 40345
Check out my answer to this question: what is the difference between train, validation and test set, in neural networks?
You should use 3 sets of data: training, validation, and testing.
The validation data set tells you when you should stop training (as I said in the other answer):
The validation data set is used to minimize overfitting. You're not adjusting the weights of the network with this data set; you're just verifying that any increase in accuracy over the training data set actually yields an increase in accuracy over a data set that has not been shown to the network before, or at least that the network hasn't trained on (i.e. the validation data set). If the accuracy over the training data set increases, but the accuracy over the validation data set stays the same or decreases, then you're overfitting your neural network and you should stop training.
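As a rough sketch of that stopping rule (TrainOneEpoch() and Accuracy() below are hypothetical stand-ins for your own training and evaluation routines, and the patience value is an assumption, not a recommendation):

/* Early-stopping sketch: keep training only while validation accuracy improves. */
double bestValidationAccuracy = 0.0;
int epochsWithoutImprovement = 0;
const int patience = 50;                 /* give up after 50 epochs with no gain */

while (epochsWithoutImprovement < patience)
{
    TrainOneEpoch(network, trainingSet);                  /* weights are adjusted only here */
    double validationAccuracy = Accuracy(network, validationSet);

    if (validationAccuracy > bestValidationAccuracy)
    {
        bestValidationAccuracy = validationAccuracy;      /* still generalizing better: continue */
        epochsWithoutImprovement = 0;
    }
    else
    {
        epochsWithoutImprovement++;                       /* flat or worse: possible overfitting */
    }
}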
A good method for validation is to use 10-fold (k-fold) cross-validation. Additionally, there are specific "strategies" for splitting your data set into training, validation and testing. It's somewhat of a science in itself, so you should read up on that too.
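For example, here is a minimal, self-contained sketch of splitting examples into 10 folds for cross-validation (indices only, shuffling omitted; the counts simply reuse the 100 examples from the question):

#include <stdio.h>

#define N_EXAMPLES 100   /* total examples, as in the question */
#define K_FOLDS     10   /* 10-fold cross-validation */

int main(void)
{
    int foldSize = N_EXAMPLES / K_FOLDS;
    for (int fold = 0; fold < K_FOLDS; fold++)
    {
        int start = fold * foldSize;
        int end   = start + foldSize;
        /* Examples [start, end) are held out for validation in this round;
           the remaining 90 are used for training. */
        printf("fold %d: validate on examples %d..%d, train on the rest\n",
               fold, start, end - 1);
    }
    return 0;
}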
Regarding your comment on the error, I would point you to some resources which can give you a better understanding of neural networks (it's kinda math heavy, but see below for more info):
Section 5.9 of Colin Fahey's article describes it best:
Backward error propagation formula:
The error values at the neural network outputs are computed using the following formula:
Error = (Output - Desired); // Derived from: Output = Desired + Error;
The error accumulation in a neuron body is adjusted according to the output of the neuron body and the output error (specified by links connected to the neuron body). Each output error value contributes to the error accumulator in the following manner:
ErrorAccumulator += Output * (1 - Output) * OutputError;
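To make those two lines concrete, here is a tiny self-contained sketch for a single sigmoid output neuron (the numeric values are made up purely for illustration):

#include <stdio.h>

int main(void)
{
    double output  = 0.80;   /* actual activation of the output neuron */
    double desired = 1.00;   /* target value for this training example */

    double error = output - desired;                      /* Error = (Output - Desired) */

    double errorAccumulator = 0.0;
    errorAccumulator += output * (1.0 - output) * error;  /* sigmoid derivative times output error */

    printf("Error = %f, ErrorAccumulator = %f\n", error, errorAccumulator);
    return 0;
}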
Upvotes: 8