mohd faizan umar

Reputation: 1

Feature scaling in linear models

How do I know whether feature scaling is required in linear regression, multiple linear regression, or polynomial regression? In some places I read that feature scaling is not required because the coefficients adapt to the scale, and in other places I read that it is required. So what is the actual answer?

Upvotes: 0

Views: 488

Answers (2)

Ankish Bansal

Reputation: 1902

Both statements are correct but incomplete.

If you are using a simple linear model such as y = w1 * x1 + w2 * x2, then feature scaling is not required: the coefficients w1 and w2 will be learned and will adapt to each feature's scale.

But if you modify the above expression with a regularization term, or define constraints over the variables, then without feature scaling the coefficients will be biased toward the feature with the larger magnitude.

In conclusion: feature scaling becomes important as soon as you modify the objective of the simple linear model. It is also good practice to normalize the features before applying any algorithm.
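A minimal sketch of this point, using scikit-learn on synthetic data (the feature magnitudes and the `alpha` value are illustrative assumptions): plain least squares recovers the true coefficients regardless of feature scale, while ridge regression's penalty shrinks the coefficient of the small-magnitude feature much harder unless the features are standardized first.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Two features with very different magnitudes: x1 in [0, 1], x2 in [0, 1000].
X = np.column_stack([rng.uniform(0, 1, 200), rng.uniform(0, 1000, 200)])
# True model: y = 3.0 * x1 + 0.002 * x2 + small noise.
y = 3.0 * X[:, 0] + 0.002 * X[:, 1] + rng.normal(0, 0.01, 200)

# Plain least squares: the coefficients simply adapt to each feature's scale.
ols = LinearRegression().fit(X, y)

# Ridge penalizes squared coefficients, so the feature that needs a large
# coefficient (the small-magnitude x1) is shrunk disproportionately.
ridge_raw = Ridge(alpha=10.0).fit(X, y)

# After standardizing, both features compete on an equal footing.
X_scaled = StandardScaler().fit_transform(X)
ridge_scaled = Ridge(alpha=10.0).fit(X_scaled, y)

print("OLS coefficients:      ", ols.coef_)
print("Ridge (unscaled) coefs:", ridge_raw.coef_)
print("Ridge (scaled) coefs:  ", ridge_scaled.coef_)
```

On this data the unscaled ridge coefficient for x1 comes out visibly below the true value of 3.0, while after scaling the two standardized coefficients are of comparable size.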

Upvotes: 2

Roshin Raphel

Reputation: 2689

Suppose we have two features, weight and price. "Weight" cannot be meaningfully compared with "Price", yet the algorithm will assume that, because the "Weight" values are larger than the "Price" values, "Weight" is more important than "Price".

Feature scaling is required when the data columns have a large variation in their ranges. Getting the min, max, and mean of each column is a good first check.

Plotting the data is the next step: it makes the ranges of the different dimensions easy to see.
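A quick sketch of that first check, using NumPy on a made-up weight/price table (the values are purely illustrative): printing per-column min, max, and mean immediately exposes the range mismatch between the two features.

```python
import numpy as np

# Hypothetical data: weight in grams, price in dollars.
data = np.array([
    [300.0, 3.0],
    [250.0, 2.0],
    [800.0, 5.0],
    [600.0, 4.0],
])
columns = ["weight", "price"]

# Per-column summary: a large gap between the two ranges is the signal
# that feature scaling is worth applying.
for name, col in zip(columns, data.T):
    print(f"{name}: min={col.min():.1f} max={col.max():.1f} mean={col.mean():.1f}")
```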

Upvotes: 0
