Reputation: 11865
In the "Iteration 1" image on the following wiki page you can see a basic neural network: http://www.heatonresearch.com/wiki/Back_Propagation
If you look at the hidden layer's first neuron (H1), you'll notice that the sum is -0.5313402159445314 and the output is 0.3702043582229371.
What I can't figure out is how the output was calculated from the sum. If I apply the sigmoid or hyperbolic tangent function to the sum value, I get different results.
Thanks
Upvotes: 0
Views: 71
Reputation: 33509
They are using the sigmoid activation function.
The formula is 1/(1+exp(-x)).
We can check in Python via:
from math import exp

x = -0.5313402159445314
print(1.0 / (1.0 + exp(-x)))

This prints 0.370204358223, matching H1's output in the wiki.
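To see why you got different results with your attempts, here is a short sketch (the helper name `sigmoid` is my own) comparing sigmoid and tanh on the same sum; only sigmoid reproduces the wiki's output:

    from math import exp, tanh

    def sigmoid(x):
        # logistic (sigmoid) function: 1 / (1 + e^-x)
        return 1.0 / (1.0 + exp(-x))

    s = -0.5313402159445314   # H1's weighted sum from the wiki
    print(sigmoid(s))         # ~0.3702, matches H1's output
    print(tanh(s))            # ~-0.486, clearly different

So if you applied tanh (or a sigmoid with a different formula, e.g. a scaled variant), you would see a mismatch even though the sum is correct.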
Upvotes: 2