Polynomial interpolation using scikit-learn and Python


When we plot the result of a linear regression, there is no guarantee that a straight line will fit the data well.

What happens when the relationship has a higher degree? Polynomial regression is one solution to sort this out. First, we need to understand how polynomial regression works.

In this blog, we will see what polynomial regression is, what the degree means, and walk through hands-on code for applying it in a model. In polynomial regression, we actually train our model with extended features: we add powers of the existing features to create new features that capture patterns the original ones miss.

Day by day, datasets are getting bigger and more complex, and it is often hard to fit the data with a single straight line. So we use polynomial features, where we square or cube a feature (or raise it to any other power) according to demand.
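To see concretely what this feature expansion does, here is a minimal sketch (the array values are invented for illustration) of what `PolynomialFeatures` produces for a single feature:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

# One feature x, three sample values.
X = np.array([[2.0], [3.0], [4.0]])

# degree=2 adds the squared column; include_bias=False drops the constant-1 column.
poly = PolynomialFeatures(degree=2, include_bias=False)
X_poly = poly.fit_transform(X)
print(X_poly)
# [[ 2.  4.]
#  [ 3.  9.]
#  [ 4. 16.]]
```

Each row now carries both x and x², so a plain linear model fit on these columns can represent a parabola.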

How does it work?

  • First, we do all the usual preprocessing of the data.
  • Then we transform the features with polynomial features, where we decide the degree.
  • After that, we apply linear regression on the transformed features.
  • And get the predictions.

Let us see the coding part, which is as easy as it sounds:

import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# after the preprocessing and train/test split, we apply the polynomial features in our model

features = PolynomialFeatures(degree=2 , include_bias=False)
X_poly = features.fit_transform(X_train)

regression = LinearRegression()
regression.fit(X_poly, y_train)  # train on the polynomial-expanded features
print(regression.intercept_ , regression.coef_)

In the code:

–> First, we import all the necessary libraries.

–> Second, we fit the training data with polynomial features of degree 2; you can use 3 as well, it depends on the dataset.

–> Third, we train the model with linear regression, but we fit the regressor on the polynomial-expanded data rather than the raw features.

–> And at last, we get the results.
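The steps above can be sketched end to end. The toy dataset below is invented for illustration, standing in for whatever train/test split your preprocessing produced; the key point is that the test data must go through the same transform before predicting:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Toy quadratic data standing in for the real train/test split.
rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, size=(50, 1))
y_train = 0.5 * X_train[:, 0] ** 2 + X_train[:, 0] + rng.normal(0, 0.1, 50)
X_test = np.array([[0.0], [1.0], [2.0]])

# Expand the features, then fit an ordinary linear regression on them.
features = PolynomialFeatures(degree=2, include_bias=False)
X_poly = features.fit_transform(X_train)

regression = LinearRegression()
regression.fit(X_poly, y_train)

# Transform (not fit_transform) the test set with the SAME features object.
y_pred = regression.predict(features.transform(X_test))
print(y_pred)
```

Because the underlying curve is 0.5x² + x, the predictions come out close to 0, 1.5, and 4 for the three test points.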


Polynomial degree? To extend the features we need a degree, a whole number that can in principle be anything, though in practice we generally use 2 or 3 and not more than that. What does this degree actually do? In polynomial regression we add powers of the features, and the degree decides the highest power: square, cube, or anything else.
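A quick way to see what the degree controls is to count the columns it generates; the two-feature input below is just an illustration:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

X = np.array([[2.0, 3.0]])  # two features: a, b

counts = {}
for degree in (2, 3):
    poly = PolynomialFeatures(degree=degree, include_bias=False)
    counts[degree] = poly.fit_transform(X).shape[1]
print(counts)
# {2: 5, 3: 9}
# degree 2 -> a, b, a^2, ab, b^2
# degree 3 -> adds a^3, a^2*b, a*b^2, b^3
```

Note how the column count grows quickly with the degree, which is one reason degrees above 3 are rarely used.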

Conclusion: With plain linear regression it is sometimes impossible to capture the data with a single straight line. We need something that extends the features and gives us the desired output, and this is exactly what polynomial regression does.
