Reputation: 1682
I'm implementing a neural network with the help of Prof. Andrew Ng's lectures, using the algorithm in Figure 31.
I think I understand forward propagation and backward propagation fine, but I'm confused about updating the weights (theta) after each iteration.
Q1. When and HOW do I update the weight (theta) matrices theta1 and theta2?
Q2. What is big Delta for? [Solved, thanks @xhudik]
Q3. Do we have to add a +1 bias unit in the input and hidden layers?
Upvotes: 1
Views: 5711
Reputation: 2444
Q1: explained by @nikie (kudos).
Q2: Andrew Ng's presentations are great. However, the one you are pointing to is high-level, and you want to understand the details. What about this: http://galaxy.agh.edu.pl/~vlsi/AI/backp_t_en/backprop.html It gives you many more details, with useful graphics.
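To make the accumulate-then-update scheme concrete: big Delta collects the gradient contributions from all m training examples, and the thetas are updated only once per iteration, after that loop. Below is a minimal NumPy sketch of one such iteration for a one-hidden-layer network; the shapes, parameter names (`alpha`, `lam`), and helper `train_step` are my own assumptions for illustration, not the course's exact code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(X, Y, theta1, theta2, alpha=0.5, lam=0.0):
    """One batch gradient-descent iteration (hypothetical helper).

    Big Delta accumulates gradients over all m examples; theta1 and
    theta2 are updated once, AFTER the loop over examples.
    """
    m = X.shape[0]
    Delta1 = np.zeros_like(theta1)
    Delta2 = np.zeros_like(theta2)
    for i in range(m):
        # forward propagation: a +1 bias unit is prepended to the
        # input and hidden activations (this answers Q3)
        a1 = np.concatenate(([1.0], X[i]))
        a2 = np.concatenate(([1.0], sigmoid(theta1 @ a1)))
        a3 = sigmoid(theta2 @ a2)
        # backward propagation
        d3 = a3 - Y[i]                                      # output error
        d2 = (theta2.T @ d3)[1:] * a2[1:] * (1.0 - a2[1:])  # drop bias row
        # accumulate into big Delta (this is what big Delta is for)
        Delta2 += np.outer(d3, a2)
        Delta1 += np.outer(d2, a1)
    # average gradients; regularization skips the bias column
    D1 = Delta1 / m
    D2 = Delta2 / m
    D1[:, 1:] += (lam / m) * theta1[:, 1:]
    D2[:, 1:] += (lam / m) * theta2[:, 1:]
    # the weight update happens HERE, once per iteration (Q1)
    return theta1 - alpha * D1, theta2 - alpha * D2
```

Calling `train_step` repeatedly performs batch gradient descent: each call returns the updated theta matrices, which you feed into the next call.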
Upvotes: 6