Reputation: 495
I am trying to code a neural network using the Nguyen-Widrow algorithm for weight initialization, and I am quite confused about one point.
The Nguyen-Widrow algorithm says that we first calculate the Beta value:
Beta = 0.7 * p^(1/n)
with:
p = number of hidden units
n = number of input units
Do we need to count the bias node for n and p too? I mean, if the total number of hidden nodes (not counting the bias node) is 5, should p be 6? Or is it still 5?
Thank you
Upvotes: 0
Views: 1567
Reputation: 3497
The bias is treated like any other input; the only difference is that its value remains constant. The bias has a weight of its own, which changes with the learning algorithm, and it should be included in the initialization algorithm too.
For instance, have a look at MATLAB's documentation:
initnw
is a layer initialization function that initializes a layer's weights and biases according to the Nguyen-Widrow initialization algorithm. This algorithm chooses values in order to distribute the active region of each neuron in the layer approximately evenly across the layer's input space. The values contain a degree of randomness, so they are not the same each time this function is called.
Answer
The bias will count for n: Number of inputs to the layer
The bias will not count for p: Number of nodes in the layer
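To make the counting concrete, here is a minimal NumPy sketch under this answer's reading (bias counted in n, not in p). The function name, the [-0.5, 0.5] starting range, and the normalization details are my own assumptions for illustration, not taken from MATLAB's initnw:

```python
import numpy as np

def nguyen_widrow_init(n_real_inputs, n_neurons, rng=None):
    # Hypothetical helper: Nguyen-Widrow initialization for one layer,
    # treating the bias as an extra input whose value is always 1.
    rng = np.random.default_rng() if rng is None else rng

    n = n_real_inputs + 1            # the bias counts for n
    p = n_neurons                    # the bias does not count for p

    beta = 0.7 * p ** (1.0 / n)      # Beta = 0.7 * p^(1/n)

    # Small random starting weights; the last column is the bias weight,
    # included in the initialization like any other weight.
    w = rng.uniform(-0.5, 0.5, size=(p, n))

    # Rescale each neuron's weight vector so its norm equals Beta,
    # spreading the neurons' active regions across the input space.
    w *= beta / np.linalg.norm(w, axis=1, keepdims=True)
    return w
```

For the question's example (5 hidden nodes, bias excluded) with, say, 4 real inputs, this gives p = 5 and n = 5.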
Note
You may also want to check similar questions:
Neural Network Initialization - Nguyen Widrow Implementation?
Upvotes: 1
Reputation: 586
First of all, there is no bias node (unit) in artificial neural networks. Each node (unit) has a bias input as well as its other inputs. So the number of hidden units (p) is constant, and in your example it is always 5.
The thing that might change when you add a bias is the number of inputs (n). I searched some articles and textbooks, and none of them explained it, but from the examples I think you should not count the bias as an input unit. So if you have 4 input nodes and a bias, n will be 4.
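As a quick numeric check of that reading (the snippet is mine, not from any of those references):

```python
# 4 input nodes plus a bias, 5 hidden units; the bias is NOT counted,
# so n = 4 and p = 5.
beta = 0.7 * 5 ** (1 / 4)
print(round(beta, 4))  # 1.0467
```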
Upvotes: 0