In a recent article, we learned how to reduce variance with regularization. Another way to reduce the number of features, and to address the problem of collinear features, is to use the Lasso linear model. This model shrinks coefficients and sometimes sets them exactly to zero, removing features so that only the ones that affect the target remain. In this article, we will learn how to fit a Lasso linear regression model with Sklearn.
To create a Lasso model, we use the Lasso class from the linear_model module. Before fitting, we preprocess the data by standardizing it so that every feature has a mean of 0 and a variance of 1.
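As a quick sanity check, the effect of standardization can be verified on a small toy array (the numbers here are made up for illustration):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# A tiny made-up feature matrix: 3 samples, 2 features on very different scales
X = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0]])

scaler = StandardScaler()
X_std = scaler.fit_transform(X)

# After scaling, each column has mean 0 and variance 1
print(X_std.mean(axis=0))
print(X_std.var(axis=0))
```

Standardizing matters for Lasso in particular, because the penalty is applied to all coefficients equally; features on larger scales would otherwise be penalized less.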
After that, we create an instance of Lasso with an alpha value of 0.5. Alpha is the "amount" of penalty we want to apply: the larger it is, the more the coefficients are shrunk toward zero. In practice, you will fit the model with several alpha values, compare them, and choose the best one.
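One common pattern for comparing alphas is to fit the model over a small grid and look at the scores; the sketch below uses the diabetes dataset that ships with scikit-learn, and the grid of alpha values is an arbitrary choice for illustration:

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

X, y = load_diabetes(return_X_y=True)
X_std = StandardScaler().fit_transform(X)

# Try a few penalty strengths and record the training R^2
# and how many features keep a nonzero coefficient
for alpha in [0.01, 0.1, 0.5, 1.0]:
    model = Lasso(alpha=alpha).fit(X_std, y)
    n_used = (model.coef_ != 0).sum()
    print(f"alpha={alpha}: R^2={model.score(X_std, y):.3f}, features kept={n_used}")
```

For a more principled choice, scikit-learn also provides LassoCV, which selects alpha by cross-validation automatically.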
Finally, we fit the model by passing in our X (features) and y (target).
from sklearn.linear_model import Lasso
from sklearn.datasets import load_boston  # note: removed in scikit-learn 1.2
from sklearn.preprocessing import StandardScaler

boston = load_boston()
features = boston.data
target = boston.target

# Standardize the features: mean 0, variance 1
scaler = StandardScaler()
std_feats = scaler.fit_transform(features)

# Fit a Lasso regression with a penalty of 0.5
regression = Lasso(alpha=0.5)
model = regression.fit(std_feats, target)

# score() needs data to evaluate on; it returns the R^2 of the fit
print(model.score(std_feats, target))
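To see the feature-selection effect described at the start, we can inspect the fitted model's coef_ attribute: coefficients driven exactly to zero correspond to dropped features. This sketch uses the diabetes dataset (since load_boston is no longer available in recent scikit-learn releases), and the alpha of 10 is chosen only so that the shrinkage is clearly visible:

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

data = load_diabetes()
X_std = StandardScaler().fit_transform(data.data)

# A fairly strong penalty, chosen so that some coefficients hit zero
model = Lasso(alpha=10).fit(X_std, data.target)

# Features whose coefficient was shrunk exactly to zero are effectively removed
for name, coef in zip(data.feature_names, model.coef_):
    status = "dropped" if coef == 0 else f"{coef:.2f}"
    print(f"{name}: {status}")
```

With this penalty, only the features with the strongest relationship to the target keep nonzero coefficients; the rest are removed from the model entirely.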