Effect Of Alpha On Lasso Regression

Often we want to conduct a process called regularization, in which we penalize a model's coefficients so that only the most important features keep nonzero weights. This can be particularly important when you have a dataset with 100,000+ features.

Lasso regression is a common modeling technique for regularization. The math behind it is pretty interesting, but practically what you need to know is that Lasso regression comes with a parameter, alpha: the higher the alpha, the more feature coefficients are zero.
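Concretely, the objective that scikit-learn's Lasso minimizes is the usual least-squares loss plus an L1 penalty on the coefficient vector, scaled by alpha:

```latex
\min_{w} \; \frac{1}{2 n_{\text{samples}}} \lVert y - Xw \rVert_2^2 \;+\; \alpha \lVert w \rVert_1
```

The L1 penalty is what makes coefficients hit exactly zero rather than just shrinking toward it, which is why Lasso doubles as a feature-selection tool.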

That is, when alpha is 0, Lasso regression produces the same coefficients as a linear regression. When alpha is very large, all coefficients are zero.
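You can check both extremes on a small synthetic dataset (the data below is made up purely for illustration). Note that scikit-learn discourages alpha=0 with Lasso and suggests LinearRegression instead, so a tiny alpha stands in for zero here:

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

# Hypothetical synthetic data, just to illustrate the two extremes
rng = np.random.RandomState(0)
X = rng.randn(100, 5)
true_coefs = np.array([1.5, -2.0, 0.0, 3.0, 0.5])
y = X @ true_coefs + 0.1 * rng.randn(100)

ols = LinearRegression().fit(X, y)

# A tiny alpha behaves almost identically to ordinary least squares
near_zero = Lasso(alpha=1e-6).fit(X, y)
print(np.allclose(ols.coef_, near_zero.coef_, atol=1e-3))

# A very large alpha drives every coefficient to exactly zero
big = Lasso(alpha=100).fit(X, y)
print(big.coef_)
```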

In this tutorial, I run three lasso regressions, with varying levels of alpha, and show the resulting effect on the coefficients.

Preliminaries

from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler
from sklearn.datasets import load_boston
import pandas as pd

# Load the Boston housing data
# (note: load_boston was removed in scikit-learn 1.2; on newer versions,
# substitute another regression dataset such as fetch_california_housing)
boston = load_boston()

# Standardize the features so the coefficients are on a comparable scale
scaler = StandardScaler()
X = scaler.fit_transform(boston["data"])
Y = boston["target"]
names = boston["feature_names"]

Run Three Lasso Regressions, Varying Alpha Levels

# Create a function called lasso
def lasso(alphas):
    '''
    Takes in a list of alphas. Outputs a dataframe containing the
    coefficients of lasso regressions for each alpha.
    '''
    # Create an empty data frame
    df = pd.DataFrame()

    # Create a column of feature names
    df['Feature Name'] = names

    # For each alpha value in the list of alpha values:
    for alpha in alphas:
        # Create a lasso regression with that alpha value
        lasso = Lasso(alpha=alpha)

        # Fit the lasso regression
        lasso.fit(X, Y)

        # Create a column name for that alpha value
        column_name = 'Alpha = %f' % alpha

        # Create a column of coefficient values
        df[column_name] = lasso.coef_

    # Return the dataframe
    return df

# Run the function called lasso
lasso([.0001, .5, 10])
   Feature Name  Alpha = 0.000100  Alpha = 0.500000  Alpha = 10.000000
0          CRIM         -0.920130         -0.106977               -0.0
1            ZN          1.080498          0.000000                0.0
2         INDUS          0.142027         -0.000000               -0.0
3          CHAS          0.682235          0.397399                0.0
4           NOX         -2.059250         -0.000000               -0.0
5            RM          2.670814          2.973323                0.0
6           AGE          0.020680         -0.000000               -0.0
7           DIS         -3.104070         -0.169378                0.0
9           TAX         -2.074110         -0.000000               -0.0
10      PTRATIO         -2.061921         -1.599574               -0.0
11            B          0.856553          0.545715                0.0
12        LSTAT         -3.748470         -3.668884               -0.0

Notice that as the alpha value increases, more features have a coefficient of 0.
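A quick way to use this behavior for feature selection is to count how many coefficients survive at each alpha. The sketch below uses made-up synthetic data (2 informative features out of 10) rather than the Boston data above:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Hypothetical data: 2 informative features out of 10
rng = np.random.RandomState(1)
X = rng.randn(200, 10)
y = 2.0 * X[:, 0] - 3.0 * X[:, 1] + 0.5 * rng.randn(200)

# Count surviving (nonzero) coefficients as alpha grows
kept = []
for alpha in [0.001, 0.1, 1.0]:
    model = Lasso(alpha=alpha).fit(X, y)
    kept.append(int(np.sum(model.coef_ != 0)))
print(kept)
```

As alpha increases, the count shrinks toward the truly informative features; picking alpha by cross-validation (e.g., with LassoCV) is the usual way to choose the trade-off in practice.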


Two Machine Learning Fields

There are two sides to machine learning:

• Practical Machine Learning: This is about querying databases, cleaning data, writing scripts to transform data, gluing algorithms and libraries together, and writing custom code to squeeze reliable answers from data to satisfy difficult and ill-defined questions. It's the mess of reality.
• Theoretical Machine Learning: This is about math and abstraction and idealized scenarios and limits and beauty and informing what is possible. It is a whole lot neater and cleaner and removed from the mess of reality.


