linker

Reputation: 891

In layman's terms, what's the difference between a LossFunction and an OptimizationAlgorithm?

I get the part that training a network is all about finding the right weights, with optimization algorithms deciding how the weights are updated until the values needed to get the right predictions are reached.

So the million-dollar questions attached to the main one are:

(1.) If optimization algorithms update the weights, what do loss functions do to the weights of the network?

(2.) Are loss functions specific only to the output layer of a neural network? (Most examples I see with the deeplearning4j framework implement them at the output layer.)

P.S.: I really want to understand the basic difference between these two in the simplest way possible. I am not looking for anything complex or full of mathematical explosions.

Upvotes: 1

Views: 49

Answers (1)

Franco Piccolo

Reputation: 7420

The loss function measures how wrong the network's predictions are for a given set of weights. The optimization algorithm then tries to find the minimum of that loss function; at that minimum, the weights are ideal.
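To make the split concrete, here is a minimal plain-Java sketch (not deeplearning4j API; the names and numbers are made up for illustration). The loss function only measures the error of a prediction; the optimization algorithm (plain gradient descent here) uses that measurement to decide how to change the weight.

    public class LossVsOptimizer {

        // Loss function: squared error for one training example.
        // It never changes the weight, it only scores it.
        static double loss(double w, double x, double target) {
            double prediction = w * x;
            double error = prediction - target;
            return error * error;
        }

        // Gradient of the loss with respect to w: d/dw of (w*x - target)^2.
        static double gradient(double w, double x, double target) {
            return 2.0 * (w * x - target) * x;
        }

        public static void main(String[] args) {
            double w = 0.0;               // starting weight
            double x = 2.0, target = 6.0; // the ideal weight here is 3.0
            double learningRate = 0.1;

            // Optimization algorithm: repeatedly step w "downhill" on the loss.
            // This is the part that actually updates the weight.
            for (int i = 0; i < 20; i++) {
                w -= learningRate * gradient(w, x, target);
            }
            System.out.printf("w = %.4f, loss = %.6f%n", w, loss(w, x, target));
        }
    }

Running it prints a weight close to 3.0 with a loss near zero: the loss told the optimizer how far off it was, and the optimizer did the updating.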

Upvotes: 3

Related Questions