I came across this function in a course I'm taking. What is the w@X[i] part doing?

import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import LogisticRegression

# log_loss is assumed to be the course's helper, i.e. log(1 + exp(-raw)),
# not sklearn.metrics.log_loss (which has a different signature).
def my_loss(w):
    s = 0
    for i in range(0, 569):          # loop over all training examples
        raw_model_output = w @ X[i]  # dot product of weights and features
        s = s + log_loss(raw_model_output * y[i])
    return s

# Returns the w that makes my_loss(w) smallest
w_fit = minimize(my_loss, X[0]).x
print(w_fit)

# Compare with scikit-learn's LogisticRegression
lr = LogisticRegression(fit_intercept=False, C=1000000).fit(X,y)
print(lr.coef_)

It's meant to be a from-scratch implementation of logistic regression in Python, compared against scikit-learn's LogisticRegression.
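For context on the operator itself: `@` is Python's matrix-multiplication operator (PEP 465), and for two 1-D NumPy arrays it computes the dot product, the same result as `np.dot`. A minimal sketch with made-up values (not the course's actual data):

```python
import numpy as np

# `@` is Python's matrix-multiplication operator (PEP 465).
# For two 1-D NumPy arrays it reduces to the dot product.
w = np.array([0.5, -1.0, 2.0])   # hypothetical weight vector
x = np.array([1.0, 2.0, 3.0])    # hypothetical feature row, like X[i]

raw = w @ x  # 0.5*1.0 + (-1.0)*2.0 + 2.0*3.0 = 4.5
print(raw)
print(np.allclose(raw, np.dot(w, x)))  # same as np.dot
```

So in the loop, `w @ X[i]` is the raw model output (the linear score) for the i-th example, which is then passed through the loss.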
