Alice_inwonderland

Reputation: 348

Where does gradient descent appear in machine learning libraries (e.g. scikit-learn)?

I understand how gradient descent works and that a user can manually define a gradient descent function to minimize a cost function. My question is more general: where does GD appear in scikit-learn's code when we train and test machine learning models such as linear regression or random forest? Is GD simply embedded in the .fit() function, or do we need to pass it as a parameter to the model?

Upvotes: 1

Views: 310

Answers (3)

Saurabh Jain

Reputation: 1712

I had the same confusion when I started with sklearn. There are two separate implementations of linear regression: a vanilla (closed-form) one and an SGD-based one.

Straight from the docs:

SGD is merely an optimization technique and does not correspond to a specific family of machine learning models. It is only a way to train a model.

The vanilla implementation can be found in the class sklearn.linear_model.LinearRegression, and the SGD implementation can be found in the class sklearn.linear_model.SGDRegressor.

The corresponding links can be found here and here
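To make the difference concrete, here is a minimal sketch (my own example, not from the docs) that fits the same data with both classes. LinearRegression solves the least-squares problem directly, with no gradient descent involved; SGDRegressor minimizes the same squared loss by stochastic gradient descent, entirely inside .fit():

```python
import numpy as np
from sklearn.linear_model import LinearRegression, SGDRegressor

rng = np.random.RandomState(0)
X = rng.rand(200, 1)
y = 3.0 * X[:, 0] + 2.0 + 0.01 * rng.randn(200)

# Closed-form least-squares solution: no gradient descent here.
ols = LinearRegression().fit(X, y)

# Same model, but fitted with stochastic gradient descent inside .fit().
sgd = SGDRegressor(max_iter=1000, tol=1e-3, random_state=0).fit(X, y)

print(ols.coef_, ols.intercept_)
print(sgd.coef_, sgd.intercept_)
```

Both should recover coefficients near 3 and an intercept near 2; the SGD estimate is approximate and depends on the learning-rate schedule.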

Upvotes: 1

Anant Gupta

Reputation: 23

As Kate said, sklearn models use stochastic gradient descent, and yes, it is when we call the fit method, regressor.fit(), that the model is optimized and stochastic gradient descent is applied. You can also run it separately, but there is no need; the sklearn SGD documentation may help.
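To illustrate the "you can also run it separately" point: SGDRegressor exposes partial_fit, which performs the gradient updates incrementally instead of hiding them all inside one .fit() call. A minimal sketch (my own example):

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.RandomState(0)
X = rng.rand(300, 2)
y = X @ np.array([1.5, -2.0]) + 0.5

reg = SGDRegressor(learning_rate='constant', eta0=0.01, random_state=0)

# Each partial_fit call runs one pass of SGD updates over the data,
# so the gradient descent normally buried in .fit() runs step by step.
for epoch in range(50):
    reg.partial_fit(X, y)

print(reg.coef_, reg.intercept_)
```

The coefficients should drift toward [1.5, -2.0] as the epochs accumulate, which is exactly what a single .fit() call would do for you in one go.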

Upvotes: 1

Kate Melnykova

Reputation: 1873

Short answer: most of the sklearn models use stochastic gradient descent for optimization/fitting, and you never need to specify that. Some estimators let you choose the optimizer (via the solver parameter), such as adam.

Upvotes: 1
