Reputation: 1551
We know that L1 and L2 regularization both help to avoid overfitting.
L1 regularization can lead to sparse solutions and therefore avoids fitting to the noise. L2, however, does not produce sparsity.
So I wonder: when is there a need to use L2 regularization?
Upvotes: 1
Views: 569
Reputation: 174
L2 shrinks all weight coefficients toward zero but rarely makes any of them exactly zero, while L1 can drive some coefficients exactly to zero. So L2 is good for multicollinear inputs (it spreads weight across correlated features instead of arbitrarily picking one), and L1 is good for feature selection.
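You can see the difference numerically. The sketch below (a minimal toy example, assuming NumPy; the data, penalty strength, and the coordinate-descent lasso solver are all my own illustrative choices, not from any particular library) fits ridge via its closed form and lasso via soft-thresholding on data where the second feature is pure noise: ridge leaves that weight small but nonzero, lasso zeroes it out exactly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y depends only on feature 0; feature 1 is irrelevant noise.
n = 200
X = rng.normal(size=(n, 2))
y = 3.0 * X[:, 0] + 0.1 * rng.normal(size=n)

lam = 0.1  # regularization strength (hypothetical choice for illustration)

# L2 (ridge): closed form w = (X'X + n*lam*I)^{-1} X'y.
# Shrinks every weight toward zero, but generically none lands at exactly 0.
w_ridge = np.linalg.solve(X.T @ X + n * lam * np.eye(2), X.T @ y)

# L1 (lasso): cyclic coordinate descent with soft-thresholding,
# which can set a coordinate to exactly zero.
def soft_threshold(z, t):
    return np.sign(z) * max(abs(z) - t, 0.0)

w_lasso = np.zeros(2)
for _ in range(200):
    for j in range(2):
        # Residual with feature j's current contribution removed.
        r = y - X @ w_lasso + X[:, j] * w_lasso[j]
        w_lasso[j] = soft_threshold(X[:, j] @ r, n * lam) / (X[:, j] @ X[:, j])

print("ridge:", w_ridge)  # both weights nonzero
print("lasso:", w_lasso)  # weight on the noise feature is exactly 0
```

Ridge still regularizes (both weights are pulled below the true value of 3), which is why it is the usual choice when you want shrinkage and stability under correlated inputs but no feature pruning.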
Upvotes: 1