Alex Waese-Perlman

Reputation: 56

Back propagation in a neural network that has multiple output neurons

I am currently following this tutorial http://stevenmiller888.github.io/mind-how-to-build-a-neural-network/ on building a neural network, but I am getting confused by the back propagation section. What am I supposed to do if there are multiple output neurons? In that case there would be multiple output sum margins of error:

Delta output sum = S'(sum) * (output sum margin of error)
Delta output sum = S'(1.235) * (-0.77)
Delta output sum = -0.13439890643886018
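As a sketch of how the tutorial's rule extends to several output neurons: each output neuron gets its own delta, computed independently from its own pre-activation sum and its own margin of error. The second sum/error pair below is a made-up illustration; only the first pair (1.235, -0.77) comes from the tutorial.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # Derivative of the sigmoid: S'(x) = S(x) * (1 - S(x))
    s = sigmoid(x)
    return s * (1.0 - s)

# One pre-activation sum and one margin of error per output neuron.
# The first pair is the tutorial's worked example; the second is hypothetical.
sums = [1.235, 0.8]
errors = [-0.77, 0.3]

# Apply the tutorial's rule, delta = S'(sum) * error, to each output neuron.
deltas = [sigmoid_prime(s) * e for s, e in zip(sums, errors)]
print(deltas)  # deltas[0] reproduces the tutorial's value of about -0.134399
```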

Upvotes: 2

Views: 1194

Answers (1)

Lifu Huang

Reputation: 12768

The output of a neural network is often a vector (more than one neuron). Generally speaking, what you should do is define a loss function that maps the output vector to a single real number. For example, MSE (mean squared error) is a simple choice, which just uses the squared 2-norm (the Euclidean distance between the output vector and the label vector) as the loss value. Then you can take derivatives during backprop exactly as before. The only difference is that you now take partial derivatives of a vector function (a multivariate function): one partial derivative per output component.
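A minimal sketch of the idea above, assuming an MSE loss over a vector output. The function names and the sample vectors are illustrative, not from the question's tutorial; the gradient here is just the componentwise partial derivative of the mean squared error.

```python
import numpy as np

def mse_loss(output, label):
    # Scalar loss: mean of squared componentwise errors.
    return np.mean((output - label) ** 2)

def mse_grad(output, label):
    # Partial derivative of the loss w.r.t. each output component;
    # this vector is what gets fed backward through the network.
    return 2.0 * (output - label) / output.size

output = np.array([0.77, 0.31])  # hypothetical network output
label = np.array([0.0, 1.0])     # hypothetical target vector

loss = mse_loss(output, label)
grad = mse_grad(output, label)
print(loss)
print(grad)
```

Once you have the gradient vector, backprop proceeds as in the single-output case, just with one error term per output neuron instead of one overall.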

Upvotes: 1
