Reputation: 769
I have a neural network with a lot of inputs, and I want to train it to realise that only one of the inputs matters. First I train it with input[1] = 1 and a target output of 10, then I train it with exactly the same inputs except input[1] = 0 and a target output of 0.
I train on each example until the error is 0 before switching to the other one, but the network just keeps shifting various weights up and down until the output matches the target; it never figures out that only the weights connected to input[1] matter. Is this a common problem, so to speak, and can it be avoided somehow?
PS: I'm using the sigmoid activation and its derivative.
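For concreteness, the loop described above can be sketched with a hypothetical single linear neuron (the real network, its size, and the sigmoid layers are not shown; all names and sizes here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs = 5                        # made-up size; the real network has many inputs
w = rng.normal(0.0, 0.1, n_inputs)  # hypothetical single linear neuron

x_on = np.ones(n_inputs)            # example where input[1] = 1, target 10
x_off = x_on.copy()
x_off[1] = 0.0                      # identical except input[1] = 0, target 0

def fit_one_example(w, x, target, lr=0.05, tol=1e-6):
    """Train on a single example until its error is (near) zero."""
    for _ in range(10_000):
        err = target - w @ x
        if abs(err) < tol:
            break
        w += lr * err * x           # gradient step on the squared error
    return w

for _ in range(20):                 # alternate: fit one example, then the other
    w = fit_one_example(w, x_on, 10.0)
    w = fit_one_example(w, x_off, 0.0)

# Each switch disturbs the previous example's fit again, because every
# weight that was active in the last update gets pushed around.
```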
Upvotes: 0
Views: 129
Reputation: 843
What you are doing is incremental (or selective) learning: each time you re-train the network on a new example for several epochs, you overfit that example. If you don't care about incremental learning and only care about the result on your data set, it is better to train on batches drawn from the whole data set over several iterations, until the network converges without overfitting the training data.
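A minimal sketch of the batch idea, using a hypothetical single linear neuron rather than the asker's sigmoid network (a sigmoid output saturates between 0 and 1, so it could never reach the target 10 anyway): put both examples in one batch and take full-batch gradient steps, instead of fitting one example to zero error at a time.

```python
import numpy as np

n_inputs = 5                    # made-up size for illustration
X = np.ones((2, n_inputs))      # both training examples in one batch
X[1, 1] = 0.0                   # second example: input[1] switched off
y = np.array([10.0, 0.0])       # targets from the question

w = np.zeros(n_inputs)          # start from zero weights
lr = 0.05
for _ in range(2000):
    grad = X.T @ (X @ w - y)    # gradient of 0.5 * sum of squared errors
    w -= lr * grad              # one step using BOTH examples at once

print(w.round(3))
```

Because every update sees both examples together, the only consistent way to drive the batch error down is through the weight on input[1]; the other weights stay near zero instead of being pushed back and forth.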
Upvotes: 1