Hello Mellow

Reputation: 169

Having trouble implementing a vectorized + regularized version of logistic regression's gradient descent.

So the formula looks like this (θ_0, the bias term, is not regularized):

∂J/∂θ_0 = (1/m) Σ_i (h_θ(x^(i)) − y^(i)) x_0^(i)

∂J/∂θ_j = (1/m) Σ_i (h_θ(x^(i)) − y^(i)) x_j^(i) + (λ/m) θ_j,    for j ≥ 1

And my implementation looks like this

grad[0] = ((utils.sigmoid(X.dot(theta))-y).dot(X[:,0])).sum()
grad[1:] = ((utils.sigmoid(X.dot(theta))-y).dot(X[:,1:])).sum()
grad[1:] = grad[1:] + (lambda_*theta[1:])
grad = grad/m

However, the values I get are a bit off (except for grad[0], of course).


Where did I go wrong in my code?

Upvotes: 1

Views: 203

Answers (1)

Hello Mellow

Reputation: 169

Figured it out, I'm an idiot haha. The .sum() in the second line should not be there: (h - y).dot(X[:, 1:]) already sums over the training examples and returns one gradient component per column, so the extra .sum() wrongly adds those columns together.
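A quick shape check makes this visible (made-up arrays, just to illustrate the shapes):

import numpy as np

# made-up shapes: m = 5 examples, 3 non-bias features
h_minus_y = np.random.randn(5)       # stands in for (h - y), shape (m,)
X_rest = np.random.randn(5, 3)       # stands in for X[:, 1:], shape (m, 3)

per_feature = h_minus_y.dot(X_rest)  # shape (3,): one entry per theta[1:], which is what grad[1:] needs
collapsed = per_feature.sum()        # a single scalar: the three columns added together
print(per_feature.shape)             # (3,)
print(np.shape(collapsed))           # ()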

Also cleaned up the code a bit; this is the right way to do it:

h = utils.sigmoid(X.dot(theta))                                     # hypothesis for all m examples at once
grad[0] = (1/m) * (h - y).dot(X[:, 0])                              # bias term, no regularization
grad[1:] = (1/m) * (h - y).dot(X[:, 1:]) + (lambda_/m) * theta[1:]  # remaining terms, regularized
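For anyone who wants to sanity-check this, here is a minimal self-contained sketch. utils.sigmoid comes from the course helper code, so sigmoid is defined inline here and the data is made up. It compares the vectorized gradient against a finite-difference approximation of the regularized cost:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# made-up data: m = 4 examples, first column of X is the bias/intercept
X = np.array([[1.0,  0.5, -1.2],
              [1.0, -0.3,  0.8],
              [1.0,  1.5,  0.1],
              [1.0, -0.7, -0.4]])
y = np.array([1.0, 0.0, 1.0, 0.0])
theta = np.array([0.1, -0.2, 0.3])
lambda_ = 1.0
m = y.size

# vectorized regularized gradient (theta[0] is not regularized)
h = sigmoid(X.dot(theta))
grad = np.empty_like(theta)
grad[0] = (1/m) * (h - y).dot(X[:, 0])
grad[1:] = (1/m) * (h - y).dot(X[:, 1:]) + (lambda_/m) * theta[1:]

# numerical check: central finite differences of the regularized cost
def cost(t):
    p = sigmoid(X.dot(t))
    reg = (lambda_ / (2 * m)) * np.sum(t[1:] ** 2)
    return -(1/m) * (y.dot(np.log(p)) + (1 - y).dot(np.log(1 - p))) + reg

eps = 1e-6
num_grad = np.array([(cost(theta + eps * e) - cost(theta - eps * e)) / (2 * eps)
                     for e in np.eye(theta.size)])
print(np.allclose(grad, num_grad))  # True if the gradient is right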

Upvotes: 1
