Faref

Reputation: 63

Does backpropagation use optimization function to update weights?

I know that backpropagation calculates the derivatives of the cost function with respect to the model's parameters (weights and biases). However, I want to confirm that backpropagation itself does not update the weights and biases; instead, an OPTIMIZER such as Adam, gradient descent, or another is what performs the updates.

Thanks in advance

Upvotes: 0

Views: 303

Answers (1)

Timbus Calin

Reputation: 15053

If I understand your question correctly: backpropagation only computes the gradients. When you use an optimizer in a deep learning framework (PyTorch/TensorFlow), say "Adam", the weight updates are performed by the optimizer. This happens automatically; you do not need to write any update code yourself, as the framework updates the weights and biases for you.
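To make the separation concrete, here is a minimal plain-Python sketch (a hypothetical one-parameter model, not framework code): `backprop` only computes the gradient, and a separate `sgd_step` function, playing the role of the optimizer, applies the update.

```python
def forward(w, x):
    return w * x  # model prediction for a single weight

def backprop(w, x, y):
    # Backpropagation's job: the derivative of the squared-error
    # cost (w*x - y)^2 with respect to w. It does NOT change w.
    return 2 * (forward(w, x) - y) * x

def sgd_step(w, grad, lr=0.1):
    # The optimizer's job: use the gradient to produce the new weight.
    return w - lr * grad

w = 0.0
x, y = 1.0, 2.0  # one toy training example; the optimum is w = 2
for _ in range(50):
    grad = backprop(w, x, y)  # step 1: backpropagation -> gradient
    w = sgd_step(w, grad)     # step 2: optimizer -> updated weight
print(round(w, 3))  # converges to 2.0
```

In PyTorch these two steps correspond to `loss.backward()` (compute gradients) followed by `optimizer.step()` (update weights and biases).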

Upvotes: 1

Related Questions