chang dae Kim

Reputation: 1

In the Gradient Descent algorithm, how is the -2*wx term derived?

This is part of a Gradient Descent algorithm:

this.updateWeights = function() {

  let wx;
  let w_deriv = 0;
  let b_deriv = 0;

  // Accumulate gradient contributions from every data point
  for (let i = 0; i < this.points; i++) {
    // Error term: actual y minus the prediction (w * x + b)
    wx = this.yArr[i] - (this.weight * this.xArr[i] + this.bias);
    // Contribution to d(MSE)/dw: -2 * error * x
    w_deriv += -2 * wx * this.xArr[i];
    // Contribution to d(MSE)/db: -2 * error
    b_deriv += -2 * wx;
  }

  // Step against the averaged gradient, scaled by the learning rate
  this.weight -= (w_deriv / this.points) * this.learnc;
  this.bias -= (b_deriv / this.points) * this.learnc;
}
            

Please explain this part:

-2 * wx * this.xArr[i]

How is this expression derived?

How do you arrive at it from the math formula?

Upvotes: 0

Views: 51

Answers (2)

Yilmaz

Reputation: 49661

This is the Mean Squared Error (MSE) loss function formula:

$$\text{MSE} = \frac{1}{n}\sum_{i=1}^{n} \bigl(y_i - \hat{y}_i\bigr)^2$$

$\hat{y}$ (y with a hat) represents the prediction. In wx = this.yArr[i] - (this.weight * this.xArr[i] + this.bias) you are calculating the error term, i.e. the actual value minus the prediction.
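To make that connection concrete, here is a minimal sketch of the loss computation itself (meanSquaredError is a hypothetical standalone helper, not part of the question's object; it assumes plain numeric arrays xArr and yArr of equal length):

// Hypothetical helper: average of the squared errors over all points
function meanSquaredError(xArr, yArr, weight, bias) {
  let sum = 0;
  for (let i = 0; i < xArr.length; i++) {
    const guess = weight * xArr[i] + bias; // y-hat, the prediction
    const error = yArr[i] - guess;         // same quantity as wx in the question
    sum += error * error;                  // squared error
  }
  return sum / xArr.length;                // mean over n points
}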

The goal of gradient descent is to iteratively update the slope (m) and the bias (b) (also called the weight and the intercept) to minimize the loss function. To update them, we take the partial derivatives of the loss function with respect to m and b:

Partial derivatives of the Mean Squared Error:

$$\frac{\partial \text{MSE}}{\partial w} = \frac{1}{n}\sum_{i=1}^{n} -2\,x_i\bigl(y_i - \hat{y}_i\bigr), \qquad \frac{\partial \text{MSE}}{\partial b} = \frac{1}{n}\sum_{i=1}^{n} -2\bigl(y_i - \hat{y}_i\bigr)$$

So the part you are asking about, -2 * wx * this.xArr[i], is the partial derivative of the Mean Squared Error with respect to w (the weight).
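Spelled out, it is a direct application of the chain rule, writing the prediction as $\hat{y}_i = w x_i + b$:

$$\frac{\partial \text{MSE}}{\partial w} = \frac{1}{n}\sum_{i=1}^{n} 2\bigl(y_i - (w x_i + b)\bigr)\cdot\frac{\partial}{\partial w}\bigl(y_i - w x_i - b\bigr) = \frac{1}{n}\sum_{i=1}^{n} -2\,x_i\bigl(y_i - (w x_i + b)\bigr)$$

Since wx in the code holds y_i - (w x_i + b), each summand is exactly -2 * wx * this.xArr[i], and the division by this.points in the update step supplies the 1/n factor. The same steps with $\frac{\partial}{\partial b}(y_i - w x_i - b) = -1$ give the -2 * wx term for the bias.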

Upvotes: 0

Sen Lin

Reputation: 468

It's derived from the partial derivative of the MSE loss function with respect to w. I wrote a simplified derivation on paper, hoping it's helpful. attached image
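If you want to double-check the derivation numerically, a finite-difference test is a quick sanity check: the analytic gradient from the question's loop should match the slope of the loss measured directly. This is just an illustrative sketch; the mse helper and the sample data are made up here:

// Loss: mean of squared errors for given w and b
function mse(xArr, yArr, w, b) {
  let sum = 0;
  for (let i = 0; i < xArr.length; i++) {
    const e = yArr[i] - (w * xArr[i] + b);
    sum += e * e;
  }
  return sum / xArr.length;
}

const xArr = [1, 2, 3, 4];
const yArr = [3, 5, 7, 9]; // generated by y = 2x + 1
const w = 0.5, b = 0.0, h = 1e-6;

// Analytic partial derivative with respect to w, as in the question's loop
let wDeriv = 0;
for (let i = 0; i < xArr.length; i++) {
  const wx = yArr[i] - (w * xArr[i] + b);
  wDeriv += -2 * wx * xArr[i];
}
wDeriv /= xArr.length;

// Central finite-difference approximation of the same derivative
const numeric = (mse(xArr, yArr, w + h, b) - mse(xArr, yArr, w - h, b)) / (2 * h);

console.log(wDeriv, numeric); // both print approximately -27.5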

Upvotes: 0
