idpd15

Reputation: 456

Understanding of threshold value in a neural network

Consider this hypothetical neural network: neurons 1 and 2 both feed into neuron 3.

$o_1$ is the output of neuron 1.  
$o_2$ is the output of neuron 2.  
$w_1$ is the weight of the connection between neurons 1 and 3.   
$w_2$ is the weight of the connection between neurons 2 and 3.  
So the input to neuron 3 is $i = o_1 w_1 + o_2 w_2$.   
Let the activation function of neuron 3 be the sigmoid function  
$f(x) = \dfrac{1}{1+e^{-x}}$ and the threshold value of neuron 3 be $\theta$.  
Therefore, the output of neuron 3 will be $f(i)$ if $i \geq \theta$ and $0$ if $i \lt \theta$.  
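To make my understanding concrete, here is a small Python sketch of the rule I have in mind (the numbers are made up):

```python
import math

def neuron3_output(o1, o2, w1, w2, theta):
    """Output of neuron 3 under the rule described above:
    apply the sigmoid only if the weighted input reaches the threshold."""
    i = o1 * w1 + o2 * w2                    # weighted input to neuron 3
    if i >= theta:
        return 1.0 / (1.0 + math.exp(-i))    # sigmoid f(i)
    return 0.0                               # below threshold -> output 0

print(neuron3_output(0.5, 0.8, 0.4, 0.6, theta=0.5))  # i = 0.68 >= 0.5, so f(0.68) ~ 0.66
```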

Am I correct?

Upvotes: 0

Views: 2838

Answers (2)

Adam Johnston

Reputation: 1411

Thresholds are used for binary neurons (perceptrons, i.e. threshold units), whereas biases are used for sigmoid (and pretty much all modern) neurons. Your understanding of the threshold is correct, but again, it applies to neurons whose output is either 1 or 0, which is not very useful for learning (optimization). With a sigmoid neuron, you would simply add the bias (previously the threshold, but moved to the other side of the equation), so your output would be f(weight * input + bias). All the sigmoid function is doing (for the most part) is limiting your output to a value between 0 and 1.
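In code, the difference looks roughly like this (a minimal sketch; the function names and numbers are just for illustration):

```python
import math

def step_neuron(x, w, threshold):
    # Perceptron-style neuron: fires 1 only if the weighted input
    # reaches the threshold, otherwise 0.
    return 1 if x * w >= threshold else 0

def sigmoid_neuron(x, w, bias):
    # Sigmoid neuron: the threshold becomes a bias added to the weighted
    # input, and the output is squashed smoothly into (0, 1).
    z = x * w + bias
    return 1.0 / (1.0 + math.exp(-z))

print(step_neuron(0.7, 0.5, threshold=0.4))   # 0, since 0.35 < 0.4
print(sigmoid_neuron(0.7, 0.5, bias=-0.4))    # ~0.49, a smooth value near the boundary
```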

Upvotes: 1

Sami Tahri

Reputation: 1217

I do not think this is the place to ask this sort of question; you will find a lot of NN resources online. For your simple case, each link has a weight, so basically the input of neuron 3 is:

Neuron3Input = Neuron1Output * WeightOfLinkNeuron1To3 + Neuron2Output * WeightOfLinkNeuron2To3 + Bias

Then, to get the output, just apply the activation function:

Neuron3Output = F_Activation(Neuron3Input)

O3 = F(O1 * W1 + O2 * W2 + Bias)
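A quick Python sketch of that formula, with made-up numbers just to show the computation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

o1, o2 = 0.9, 0.3      # outputs of neurons 1 and 2
w1, w2 = 0.5, -0.2     # weights of the links 1->3 and 2->3
bias = 0.1             # bias of neuron 3

neuron3_input = o1 * w1 + o2 * w2 + bias   # 0.45 - 0.06 + 0.1 = 0.49
neuron3_output = sigmoid(neuron3_input)    # ~0.62
print(neuron3_output)
```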

Upvotes: 0
