Farnaz

Reputation: 584

What does it mean if I cannot get 0 error on a very small training dataset?

In order to check whether a network can learn at all, people often try to overfit it on a small dataset.

I cannot reach 0 error with my dataset, but the output looks like the network memorizes the training set (MAPE ~1%).

Is it absolutely necessary to get 0 error in order to show that my network can work on my dataset?
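
For reference, the sanity check I am running looks roughly like the sketch below (PyTorch assumed; the model and data here are hypothetical placeholders for my real setup):

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for my real model and data: a tiny regression
# net and a single batch of 16 examples.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
x = torch.randn(16, 10)
y = torch.randn(16, 1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Train repeatedly on the same tiny batch and watch whether the
# training loss approaches zero.
for step in range(5000):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    if step % 1000 == 0:
        print(step, loss.item())
```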

Upvotes: 0

Views: 296

Answers (1)

BenedictWilkins

Reputation: 1263

Short answer: No

Reasons:

  1. It may be that a small number of examples are mislabeled. In the case of classification, try to identify which examples the network is unable to classify correctly (see the sketch after this list). This will tell you whether your network has learnt all it can.
  2. It can also happen if your data has no pattern that can be learnt - if the data is essentially random.
  3. If the data is noisy, sometimes the noise will mask the features that are required for prediction.
  4. The dataset may be chaotic, in the sense that the features vary quickly and dramatically between (and among) labels, i.e. your data follows a very complex (non-smooth) function.
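
As an illustration of point 1, here is a minimal sketch of flagging the training examples the network still misclassifies (PyTorch assumed; the model and data are hypothetical placeholders):

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins: a small classifier and a labelled training set.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 3))
x = torch.randn(64, 10)
labels = torch.randint(0, 3, (64,))

model.eval()
with torch.no_grad():
    preds = model(x).argmax(dim=1)

# Indices of examples the network still gets wrong; inspect these by hand
# to see whether they are mislabelled, noisy, or genuinely ambiguous.
wrong = (preds != labels).nonzero(as_tuple=True)[0]
print(wrong.tolist())
```

Any indices that keep showing up after further training are worth inspecting manually for label errors or noise.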

Hope this helps!

Upvotes: 1
