Mayank Gupta

Reputation: 11

Is the given code for gradient descent updating the parameters sequentially or simultaneously?

I'm new to machine learning and have been learning the gradient descent algorithm. I believe this code uses a simultaneous update, even though it looks like a sequential one: both partial derivatives are calculated before either w or b is updated, so each parameter's update is applied using the original values of w and b. Am I wrong?


dj_dw = ((w * x[i] + b - y[i]) * x[i]) / m  # gradient w.r.t. w, computed from the current w and b
dj_db = (w * x[i] + b - y[i]) / m           # gradient w.r.t. b, also from the current w and b
w = w - a * dj_dw  # parameters are only overwritten after both gradients exist
b = b - a * dj_db

The language is Python 3.

x and y are the training set.

w and b are the parameters being updated. I'm using gradient descent for linear regression.

dj_dw is the partial derivative of the mean squared error cost function with respect to w, and likewise dj_db with respect to b.
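For reference, with the usual 1/(2m) scaling of the squared-error cost (which matches the 1/m factor in my code), the derivatives are:

J(w, b) = \frac{1}{2m} \sum_{i=1}^{m} (w x_i + b - y_i)^2

\frac{\partial J}{\partial w} = \frac{1}{m} \sum_{i=1}^{m} (w x_i + b - y_i)\, x_i,
\qquad
\frac{\partial J}{\partial b} = \frac{1}{m} \sum_{i=1}^{m} (w x_i + b - y_i)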

Apologies for any errors, I'm new.

I tried cross-checking with Gemini and ChatGPT, and they said it was sequential, hence the confusion.

Upvotes: 0

Views: 72

Answers (1)

Mustufa Abbas

Reputation: 11

It's a simultaneous update. Both partial derivatives dj_dw and dj_db are computed from the same, not-yet-updated values of w and b; only after both gradients exist are w and b overwritten. So the update of w cannot leak into the update of b within the same iteration. A sequential update would recompute dj_db after w had already changed.
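To make the difference concrete, here is a minimal runnable sketch contrasting the two schemes (the data, learning rate, and the helper function grads are illustrative, not from the question):

import numpy as np

def grads(w, b, x, y):
    """Batch gradients of the 1/(2m)-scaled squared-error cost."""
    m = len(x)
    err = w * x + b - y
    return (err * x).sum() / m, err.sum() / m

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])
a = 0.1  # learning rate

# Simultaneous update (what the question's code does):
# both gradients come from the same, unmodified w and b.
w, b = 0.0, 0.0
dj_dw, dj_db = grads(w, b, x, y)
w = w - a * dj_dw
b = b - a * dj_db
print("simultaneous:", w, b)

# Sequential update (for contrast):
# b's gradient is recomputed after w has already changed.
w, b = 0.0, 0.0
dj_dw, _ = grads(w, b, x, y)
w = w - a * dj_dw             # w changes first...
_, dj_db = grads(w, b, x, y)  # ...so this gradient already sees the new w
b = b - a * dj_db
print("sequential:  ", w, b)

After a single step the two schemes already give different values of b, which is exactly the discrepancy a sequential update would introduce.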

Upvotes: 0
