Gordon

Reputation: 446

In neural network backpropagation, how do you get the derivative equations?

I am confused about why dz = da * g'(z). As we all know, in forward propagation a = g(z), so taking the derivative with respect to z gives da/dz = g'(z), which would suggest dz = da * 1/g'(z)? Thanks!!

Upvotes: 1

Views: 397

Answers (2)

anand_v.singh

Reputation: 2838

The derivatives start from the last layer, and from there you build them backwards through the network. The derivative at each layer depends on its activation function:

Linear: g'(z) = 1 (or a vector/matrix of ones matching the layer dimensions)

Sigmoid: g'(z) = g(z) * (1 - g(z))

Tanh: g'(z) = 1 - tanh^2(z)

ReLU: g'(z) = 1 if z > 0, else 0

Leaky ReLU: g'(z) = 1 if z > 0, otherwise whatever leaky slope you chose

From there you basically have to compute the partial gradients for the previous layers. Check out http://neuralnetworksanddeeplearning.com/chap2.html for a deeper understanding.
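If it helps, here is a minimal NumPy sketch of those derivatives (the function names and the default leaky slope of 0.01 are my own choices, not from any particular framework):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def sigmoid_prime(z):
        s = sigmoid(z)
        return s * (1.0 - s)                # g'(z) = g(z) * (1 - g(z))

    def tanh_prime(z):
        return 1.0 - np.tanh(z) ** 2        # g'(z) = 1 - tanh^2(z)

    def relu_prime(z):
        return (z > 0).astype(float)        # 1 where z > 0, else 0

    def leaky_relu_prime(z, slope=0.01):
        return np.where(z > 0, 1.0, slope)  # 1 where z > 0, else the leaky slope

    def linear_prime(z):
        return np.ones_like(z)              # identity activation: derivative is 1 everywhere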

Upvotes: 0

amityadav

Reputation: 194

From what I remember, in many courses notations like dZ are a shorter way of writing dJ/dZ, and so on. All of these derivatives are derivatives of the cost J with respect to the various parameters, activations, weighted sums, etc. So dZ = dA * g'(Z) is just the chain rule: dJ/dZ = (dJ/dA) * (dA/dZ) = dA * g'(Z), not a rearrangement of dA/dZ = g'(Z).
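For example, one backward step for a sigmoid layer looks like this in NumPy (the variable names and shapes are just for illustration, and dA is a random placeholder standing in for the gradient from the next layer):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    rng = np.random.default_rng(0)
    X = rng.standard_normal((3, 5))   # 3 input features, 5 examples
    W = rng.standard_normal((2, 3))   # 2 units in this layer
    b = np.zeros((2, 1))

    # Forward pass: Z = W X + b, A = g(Z)
    Z = W @ X + b
    A = sigmoid(Z)

    # dA stands for dJ/dA, the gradient flowing back from the layer above
    # (random here because there is no actual cost function in this sketch).
    dA = rng.standard_normal(A.shape)

    # Chain rule: dZ = dJ/dZ = dJ/dA * dA/dZ = dA * g'(Z)
    dZ = dA * (A * (1.0 - A))         # for sigmoid, g'(Z) = A * (1 - A)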

Upvotes: 2
