Garrett Huff

Reputation: 81

What is the role of gradient descent in linear regression?

Can someone give me a high-level overview of how gradient descent is used in linear regression? I understand that gradient descent basically finds a local minimum efficiently, but how does that actually help fit a regression to data? Can someone give me an order of events in which the line is actually fit to the data points? I understand how to compute the gradient at a point, just not how that actually helps form the line more efficiently.

Upvotes: 1

Views: 114

Answers (1)

Garrett Huff

Reputation: 81

I found a solid answer here: https://spin.atomicobject.com/2014/06/24/gradient-descent-linear-regression/

The trick to understanding this is knowing that you start with a line y = mx + b and develop a cost function that measures how badly a given choice of m and b fits the data. That gives you a new surface over (m, b) space whose height is the fitting error, ranging from the best- to the worst-fitting lines. You then use gradient descent to basically find the lowest-error point on that surface, which contains your actual m and b values for the line. This link has a really good graph (in my non-professional machine learning opinion) of what the error surface looks like, and if you understand gradient descent you can see how it travels down the graph to find the lowest error. Sorry for jumping the gun on this question, but hopefully it can help some others who are new to machine learning!
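To make the order of events concrete, here is a minimal sketch of that process in Python with NumPy. The toy data points, learning rate, and iteration count are my own made-up values for illustration; the cost function is mean squared error, which is what the linked article uses.

```python
import numpy as np

# Toy data: points lying roughly along a straight line (made up for illustration)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

def cost(m, b):
    # Mean squared error of the line y = mx + b over the data:
    # the "height" of the error surface at the point (m, b)
    return np.mean((y - (m * x + b)) ** 2)

def step(m, b, lr=0.01):
    # Partial derivatives of the MSE with respect to m and b
    error = (m * x + b) - y
    grad_m = 2 * np.mean(error * x)
    grad_b = 2 * np.mean(error)
    # Move downhill on the error surface by a small step
    return m - lr * grad_m, b - lr * grad_b

m, b = 0.0, 0.0          # start with an arbitrary line
for _ in range(5000):    # repeatedly travel down the error surface
    m, b = step(m, b)

print(m, b)  # the slope and intercept of the lowest-error line found
```

So the line is never "drawn" directly: each iteration only nudges the pair (m, b) downhill on the error surface, and the fitted line is just whatever m and b you end up with at the bottom.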

Upvotes: 2
