By Antoreep Jana

Lasso & Ridge Regression (L1 & L2)

Lasso and Ridge regression are techniques used to counter the overfitting that can result from model complexity in simple linear regression. They are often referred to as regularizers.




Both perform regularization by adding a penalty term to the ordinary least-squares cost function.
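As a sketch, the two penalized objectives can be written in a few lines of NumPy (the names `w`, `X`, `y`, and `alpha` here are illustrative, not from any particular library):

```python
import numpy as np

def lasso_cost(w, X, y, alpha):
    """OLS squared error plus an L1 penalty (sum of absolute coefficients)."""
    residuals = y - X @ w
    return np.sum(residuals ** 2) + alpha * np.sum(np.abs(w))

def ridge_cost(w, X, y, alpha):
    """OLS squared error plus an L2 penalty (sum of squared coefficients)."""
    residuals = y - X @ w
    return np.sum(residuals ** 2) + alpha * np.sum(w ** 2)
```

With `alpha = 0` both reduce to plain least squares; increasing `alpha` increases the price paid for large coefficients.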


Lasso Regression ->




- Leads to feature selection

- As the magnitude of the coefficients grows, the penalty grows, keeping model complexity in check.

- It tends to shrink some coefficients to exactly 0.
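A minimal scikit-learn sketch of this feature-selection behaviour (the data is synthetic and the `alpha` value is arbitrary, chosen just to make the effect visible):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
# Only the first two features actually influence the target.
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=100)

model = Lasso(alpha=0.5)
model.fit(X, y)
print(model.coef_)  # coefficients of the irrelevant features are driven to 0
```

The irrelevant features end up with coefficients of exactly zero, which is why Lasso doubles as a feature selector.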



Limits ->

- If the number of predictors is greater than the number of observations, LASSO selects at most as many predictors as there are observations before it saturates.

- When 2 or more variables are highly collinear, LASSO tends to select one of them arbitrarily and discard the others.



Ridge Regression ->




- Adds a penalty equal to the sum of the squared magnitudes of the coefficients.

- Helps to reduce model complexity & multi-collinearity.

- Trades a small increase in bias for a significant reduction in variance.
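The shrinkage can be seen by comparing Ridge against plain least squares on the same synthetic data (again, the data and `alpha` here are arbitrary illustrations):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=100)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=10.0).fit(X, y)

# Ridge shrinks every coefficient toward zero,
# but none of them becomes exactly zero.
print(ols.coef_)
print(ridge.coef_)
```

This is the key contrast with Lasso: Ridge shrinks all coefficients but keeps every variable in the model.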


Limits ->

- Limits the complexity of the model but doesn't reduce the number of variables; every coefficient stays in the model.


If you liked the post, please show your appreciation. If you'd like to add something or suggest an improvement, please do so in the comment section. Thanks!