chayanquete

Reputation: 13

learning curve and validation curve in neural network

I am trying to build a neural network to study a problem with a continuous output variable. A schematic representation of the network used is shown below.

Schematic representation of neural network: input layer size = 1; hidden layer size = 8; output layer size = 1.

I am trying to understand the learning curve (error vs. number of training samples) and validation curve (error vs. regularization parameter lambda).

Learning curves (lambda = 0.01, and lambda = 10) and validation curve.
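For reference, both curves can be reproduced with scikit-learn's `learning_curve` and `validation_curve` helpers. This is only a sketch on toy data (the sinusoidal dataset, the score metric, and the lambda grid are my assumptions, not taken from the question); in scikit-learn's `MLPRegressor`, the `alpha` parameter plays the role of the regularization parameter lambda.

```python
# Sketch: learning curve (error vs. training-set size) and validation curve
# (error vs. regularization parameter) for a 1-8-1 MLP, using scikit-learn.
# The toy data below is an assumption for illustration only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import learning_curve, validation_curve

rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))          # single input feature
y = np.sin(X).ravel() + 0.1 * rng.randn(200)   # continuous target

# One hidden layer with 8 units; alpha is the L2 regularization (lambda)
mlp = MLPRegressor(hidden_layer_sizes=(8,), alpha=0.01,
                   max_iter=500, random_state=0)

# Learning curve: error vs. number of training samples, lambda fixed
sizes, train_scores, val_scores = learning_curve(
    mlp, X, y, train_sizes=np.linspace(0.2, 1.0, 5),
    cv=5, scoring="neg_mean_squared_error")

# Validation curve: error vs. lambda, training-set size fixed
lambdas = np.logspace(-4, 1, 6)
tr_scores, va_scores = validation_curve(
    mlp, X, y, param_name="alpha", param_range=lambdas,
    cv=5, scoring="neg_mean_squared_error")
```

Plotting the negated mean of `train_scores`/`val_scores` against `sizes` (and of `tr_scores`/`va_scores` against `lambdas`) gives the two curves in the figures.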

I am relatively new to machine learning, and I was wondering if someone could give me some advice on the analysis of these results. Do you think the learning curve looks OK for lambda = 0.01? Regarding the validation curve, do you also observe a minimum close to lambda = 0.01? Would you recommend increasing the number of hidden layers?

Thanks in advance,

d

Upvotes: 1

Views: 1125

Answers (1)

mamafoku

Reputation: 1139

Regarding your lambda = 10 graph, I believe your regularization parameter lambda is too large, because the training error should be smaller than the validation error.

The curve with lambda = 0.01 looks more plausible, but the training error is not improving significantly.

I would suggest a step-wise reduction of lambda driven by your cost value, so that lambda keeps adjusting as the neural network learns.
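One way to sketch such a schedule: halve lambda whenever the cost stops improving. This is an illustration on a toy L2-regularized linear model trained by plain gradient descent (the model, the learning rate, and the plateau threshold are my assumptions); the point is the schedule, not the model.

```python
# Sketch: step-wise lambda reduction tied to the cost value.
# Toy L2-regularized linear regression, plain gradient descent.
import numpy as np

rng = np.random.RandomState(1)
X = rng.randn(100, 3)
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.randn(100)

w = np.zeros(3)
lam, lr = 10.0, 0.01        # start with strong regularization
prev_cost = np.inf
for step in range(500):
    err = X @ w - y
    # cost = data term + L2 penalty weighted by lambda
    cost = 0.5 * np.mean(err ** 2) + 0.5 * lam * np.sum(w ** 2)
    if prev_cost - cost < 1e-4:   # cost has plateaued -> relax regularization
        lam *= 0.5
    prev_cost = cost
    grad = X.T @ err / len(y) + lam * w
    w -= lr * grad
```

As the cost plateaus under strong regularization, lambda is halved and training continues, so the model is regularized heavily early on and less so as it learns.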

Upvotes: 1
