Elastic Net Regression
This project demonstrates the implementation of Elastic Net Regression, a regularized regression algorithm that combines the strengths of Ridge Regression (L2) and Lasso Regression (L1).
Elastic Net helps reduce overfitting, handles multicollinearity, and can also perform feature selection.
What is Elastic Net Regression?
Elastic Net Regression is a linear regression model that applies both L1 and L2 regularization penalties.
It combines the benefits of:
- Ridge Regression → reduces model complexity and handles multicollinearity
- Lasso Regression → performs feature selection by shrinking some coefficients to zero
Elastic Net balances both techniques to produce a more stable and accurate model.
Why Elastic Net?
Sometimes datasets contain highly correlated features.
Example:
- House Size
- Number of Rooms
- Number of Bedrooms
These variables are strongly related to each other.
Problems with other models:
- Ridge Regression keeps all features and only shrinks coefficients
- Lasso Regression tends to arbitrarily keep one feature from a correlated group and drop the rest
Elastic Net solves this by combining both regularization methods.
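To see this concretely, here is a small synthetic sketch (the data, coefficients, and hyperparameter values below are illustrative choices, not part of this project): two nearly identical features are fed to Elastic Net, whose Ridge component tends to share the weight between them rather than arbitrarily dropping one.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(42)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)  # nearly identical to x1
X = np.column_stack([x1, x2])
y = 3 * x1 + 3 * x2 + rng.normal(scale=0.1, size=n)

# a mostly-Ridge mix spreads weight across the correlated pair
enet = ElasticNet(alpha=0.1, l1_ratio=0.2, max_iter=10000)
enet.fit(X, y)
print(enet.coef_)
```

Both coefficients stay nonzero and similar in size, instead of one being zeroed out.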
Mathematical Formula
Elastic Net minimizes the following loss function:
Loss = RSS + λ₁ Σ|wᵢ| + λ₂ Σwᵢ²
or
Loss = RSS + λ ( αL1 + (1 − α)L2 )
Where:
- RSS = Residual Sum of Squares
- L1 = Lasso penalty
- L2 = Ridge penalty
- λ (sklearn's `alpha`) = overall regularization strength
- α (sklearn's `l1_ratio`) = balance between the Lasso (L1) and Ridge (L2) penalties
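In scikit-learn's documented parameterization, the minimized objective is RSS/(2n) + α·ρ·Σ|wᵢ| + ½·α·(1−ρ)·Σwᵢ², where ρ is `l1_ratio`. A quick sketch to check that the fitted coefficients do minimize this quantity (the dataset and hyperparameter values are illustrative):

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import ElasticNet

X, y = load_diabetes(return_X_y=True)
n = len(y)
alpha, l1_ratio = 0.1, 0.5

model = ElasticNet(alpha=alpha, l1_ratio=l1_ratio, max_iter=10000).fit(X, y)

def objective(w, b):
    rss = np.sum((y - X @ w - b) ** 2) / (2 * n)
    l1 = alpha * l1_ratio * np.sum(np.abs(w))           # Lasso part
    l2 = 0.5 * alpha * (1 - l1_ratio) * np.sum(w ** 2)  # Ridge part
    return rss + l1 + l2

# the fitted coefficients (approximately) minimize the objective,
# so perturbing them should not lower it
print(objective(model.coef_, model.intercept_))
```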
Implementation in Python
```python
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score
from sklearn.datasets import load_diabetes

# load dataset
X, y = load_diabetes(return_X_y=True)

# split dataset
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# create model
model = ElasticNet(alpha=1.0, l1_ratio=0.5)

# train model
model.fit(X_train, y_train)

# prediction
y_pred = model.predict(X_test)

# evaluation
print("R2 Score:", r2_score(y_test, y_pred))
```
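In practice, `alpha` and `l1_ratio` are usually tuned by cross-validation rather than fixed by hand. One way is scikit-learn's `ElasticNetCV`, here combined with standardization in a pipeline (the grid values below are illustrative choices):

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# ElasticNetCV searches over alpha and l1_ratio with cross-validation
pipe = make_pipeline(
    StandardScaler(),
    ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5, random_state=42),
)
pipe.fit(X_train, y_train)
print("Test R2:", pipe.score(X_test, y_test))
```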
Important Parameters
alpha
Controls the overall strength of regularization.
Higher value → stronger regularization.
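A quick way to see this effect (the hyperparameter values are illustrative): fit the same data with a small and a large `alpha` and compare the overall size of the coefficients.

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import ElasticNet

X, y = load_diabetes(return_X_y=True)

weak = ElasticNet(alpha=0.01, l1_ratio=0.5, max_iter=10000).fit(X, y)
strong = ElasticNet(alpha=10.0, l1_ratio=0.5, max_iter=10000).fit(X, y)

# stronger regularization shrinks the coefficient vector toward zero
print(np.abs(weak.coef_).sum(), np.abs(strong.coef_).sum())
```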
l1_ratio
Controls the balance between L1 and L2 penalties.
| Value | Meaning |
|---|---|
| 0 | Ridge Regression |
| 1 | Lasso Regression |
| 0 < value < 1 | Elastic Net |
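The table can be checked directly: with `l1_ratio=1` the Elastic Net penalty reduces to a pure L1 penalty, so it should produce the same coefficients as `Lasso` at the same `alpha` (the values below are illustrative):

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import ElasticNet, Lasso

X, y = load_diabetes(return_X_y=True)

# l1_ratio=1.0 makes the Elastic Net penalty purely L1, i.e. Lasso
enet = ElasticNet(alpha=0.1, l1_ratio=1.0, max_iter=10000).fit(X, y)
lasso = Lasso(alpha=0.1, max_iter=10000).fit(X, y)

print(np.abs(enet.coef_ - lasso.coef_).max())
```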
Advantages of Elastic Net
- Handles multicollinearity
- Performs feature selection
- Reduces overfitting
- Works well with high-dimensional data
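The feature-selection claim can be verified by counting how many coefficients the L1 part shrinks to exactly zero (the hyperparameter values below are illustrative):

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import ElasticNet

X, y = load_diabetes(return_X_y=True)

# a strongly L1-weighted penalty zeroes out the weaker features
model = ElasticNet(alpha=1.0, l1_ratio=0.9, max_iter=10000).fit(X, y)
n_zero = int(np.sum(model.coef_ == 0))
print(f"{n_zero} of {model.coef_.size} coefficients shrunk to exactly zero")
```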
Use Cases
Elastic Net is commonly used in:
- Finance prediction models
- Marketing analytics
- Genomics and bioinformatics
- High-dimensional datasets
Evaluation Metric
The model is evaluated using R² Score.
R² measures how well the model explains the variance in the target variable.
Value range:
- 1 → perfect prediction
- 0 → no better than always predicting the mean
- negative → worse than predicting the mean
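Concretely, R² = 1 − SS_res / SS_tot. A small sketch comparing a manual computation against sklearn's `r2_score` (the toy numbers below are made up for illustration):

```python
import numpy as np
from sklearn.metrics import r2_score

y_true = np.array([3.0, 5.0, 7.0, 9.0])
y_pred = np.array([2.5, 5.5, 7.0, 8.0])

# R^2 = 1 - SS_res / SS_tot
ss_res = np.sum((y_true - y_pred) ** 2)              # 1.5
ss_tot = np.sum((y_true - y_true.mean()) ** 2)       # 20.0
r2_manual = 1 - ss_res / ss_tot

print(r2_manual)  # → 0.925
```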
Related Algorithms
This project is part of a Machine Learning Algorithms Series:
- Linear Regression
- Ridge Regression
- Lasso Regression
- Elastic Net Regression
Author
Suraj Singh
Machine Learning & Data Science Learner
Connect With Me
If you found this project useful, feel free to connect and follow my ML learning journey.