Linear regression using python – demystified

Key focus: Demonstrate the basics of univariate linear regression using Python's SciPy functions: train the model and use it for predictions.

Linear regression model

Regression is a framework for fitting models to data. At a fundamental level, a linear regression model assumes a linear relationship between the input variables (x) and the output variable (y). The input variables are often referred to as independent variables, features or predictors. The output is often referred to as the dependent variable, target, observed variable or response variable.

If there is only one input variable and one output variable in the given dataset, this is the simplest configuration for building a regression model, and the regression is termed univariate regression. Multivariate regression extends the concept to more than one independent variable and/or more than one dependent variable.
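
To make the distinction concrete, here is a minimal illustrative sketch (not part of the example that follows): a univariate dataset has a single feature column, while a multivariate dataset simply stacks more feature columns. The variable names are placeholders.

import numpy as np

m = 5 #number of samples
x_uni = np.random.rand(m, 1)   #univariate: one input variable (one column)
X_multi = np.random.rand(m, 3) #multivariate: three input variables (three columns)
y = np.random.rand(m, 1)       #one output/target variable

print(x_uni.shape, X_multi.shape, y.shape) #(5, 1) (5, 3) (5, 1)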

Univariate regression example

Let us start with a fictitious dataset. We construct the dataset ourselves and use it to understand the problem of linear regression, which is a supervised machine learning technique. Let's consider randomly generated data samples that follow a roughly linear trend.

import numpy as np
import matplotlib.pyplot as plt #for plotting

np.random.seed(0) #to generate predictable random numbers

m = 100 #number of samples
x = np.random.rand(m,1) #uniformly distributed random numbers
theta_0 = 50 #intercept
theta_1 = 35 #coefficient
noise_sigma = 3

noise = noise_sigma*np.random.randn(m,1) #gaussian random noise

y = theta_0 + theta_1*x + noise #noise added target
 
plt.ion() #interactive plot on
fig,ax = plt.subplots(nrows=1,ncols=1)
plt.plot(x,y,'.',label='training data')
plt.xlabel(r'Feature $x_1$');plt.ylabel(r'Target $y$')
plt.title('Feature vs. Target')
Figure 1: Simulated data for the linear regression problem

In this example, the data samples represent the feature x_1 and the corresponding targets y. Given this dataset, how can we predict the target y as a function of x_1? This is a typical regression problem.

Linear regression

Let (x_1^{(i)},y^{(i)}) be the pair that forms one training example (one point on the plot above). With m such training examples, the set (x_1,y) contains all the m pairs (x_1,y) \in \{(x_1^{(1)},y^{(1)}),(x_1^{(2)},y^{(2)}),\cdots,(x_1^{(m)},y^{(m)})\}.

In the univariate linear regression problem, we seek to approximate the target y as a linear function of the input x_1, which is the equation of a straight line (example in Figure 2), given by

\hat{y} = \theta_0 x_0 + \theta_1 x_1 \quad\quad (1)

where \theta_0 is the intercept and \theta_1 is the slope of the sought straight line, with x_0 always set to 1. The approximated target, denoted by \hat{y}, serves as a guideline for prediction.

Using all the m samples from the training set (x_1,y), we wish to find the parameters (\theta_0, \theta_1) that best approximate the relationship between the given target samples y and the straight-line function \hat{y}.

If we represent the parameters \mathbf{\theta}, the input samples x_1 and the target samples y as matrices, then equation (1) can be expressed compactly as a matrix product

\hat{\mathbf{y}} = \mathbf{X}\mathbf{\theta}, \quad \text{where} \quad \mathbf{X} = \begin{bmatrix} 1 & x_1^{(1)} \\ 1 & x_1^{(2)} \\ \vdots & \vdots \\ 1 & x_1^{(m)} \end{bmatrix}, \quad \mathbf{\theta} = \begin{bmatrix} \theta_0 \\ \theta_1 \end{bmatrix} \quad\quad (2)

It may seem that the solution for finding (\theta_0, \theta_1) is straightforward:

\mathbf{\theta} = \mathbf{X}^{-1}\mathbf{y}

However, matrix inversion is defined only for square matrices, and our design matrix \mathbf{X} has m rows but only two columns. The Moore-Penrose pseudo inverse generalizes the concept of matrix inversion to any p \times q matrix. Denoting the Moore-Penrose pseudo inverse of \mathbf{X} as \mathbf{X}^{+}, the solution for finding \mathbf{\theta} is

\mathbf{\theta} = \mathbf{X}^{+}\mathbf{y}
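
As a quick sanity check, the pseudo inverse solution can be compared against the familiar normal-equations solution \mathbf{\theta} = (\mathbf{X}^T\mathbf{X})^{-1}\mathbf{X}^T\mathbf{y}. The following is a minimal sketch on a small synthetic design matrix (the variable names here are illustrative and not part of the example above); for a tall, full-column-rank \mathbf{X} the two solutions agree.

import numpy as np
from scipy.linalg import pinv

rng = np.random.default_rng(0)
X = np.c_[ np.ones((10,1)), rng.random((10,1)) ] #tall (10 x 2) design matrix with a ones column
y = X @ np.array([[50.0],[35.0]]) + rng.normal(scale=3, size=(10,1)) #synthetic targets

theta_pinv = pinv(X) @ y #Moore-Penrose pseudo inverse solution
theta_normal = np.linalg.inv(X.T @ X) @ X.T @ y #normal-equations (least squares) solution
print(np.allclose(theta_pinv, theta_normal)) #True, up to numerical precision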

In Python, we can use the scipy.linalg.pinv function to compute the Moore-Penrose pseudo inverse and estimate (\theta_0, \theta_1).

from scipy.linalg import pinv

xMat = np.c_[ np.ones([len(x),1]), x ] #design matrix: a column of ones (for the intercept) followed by x
theta_estimate = pinv(xMat).dot(y) #least-squares estimate via the Moore-Penrose pseudo inverse
print(f'theta_0 estimate: {theta_estimate[0]}')
print(f'theta_1 estimate: {theta_estimate[1]}')

The code results in the following estimates for (\theta_0, \theta_1), which are very close to the values used to generate the random data points for this problem.

>> theta_0 estimate: [50.66645323]
>> theta_1 estimate: [34.81080506]
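
As an optional cross-check (not part of the original workflow), NumPy's built-in least-squares solver np.linalg.lstsq should return essentially the same estimates; this snippet assumes xMat and y from the code above are still in scope.

theta_lstsq, residuals, rank, sv = np.linalg.lstsq(xMat, y, rcond=None) #least-squares fit
print(f'lstsq estimates: {theta_lstsq.ravel()}') #expected to be close to [50.67, 34.81]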

Now that we know the parameters (\theta_0, \theta_1) of our example system, target predictions for new values of the feature x_1 can be made as follows.

x_new = np.array([[-0.2],[0.5],[1.2] ]) #new unseen inputs
x_newmat = np.c_[ np.ones([len(x_new),1]), x_new ] #form xNew matrix
y_predict  = np.dot(x_newmat,theta_estimate)
>>> y_predict #predicted y values for new inputs for x_1
array([[43.70429222],
       [68.07185576],
       [92.43941931]])
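
For comparison, a similar fit and prediction can be obtained with scikit-learn's LinearRegression. This is an alternative sketch assuming scikit-learn is installed, not the method used in this article; it reuses x, y and x_new from the code above.

from sklearn.linear_model import LinearRegression

reg = LinearRegression().fit(x, y) #fits intercept and slope from the training data
print(reg.intercept_, reg.coef_)   #expected to be close to 50.67 and 34.81
print(reg.predict(x_new))          #predictions for the same new inputs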

The approximated target, as a linear function of the feature, is plotted as a straight line.

plt.plot(x_new,y_predict,'-',label='prediction')
plt.text(0.7, 55, r'Intercept $\theta_0$ = %0.2f'%theta_estimate[0])
plt.text(0.7, 50, r'Coefficient $\theta_1$ = %0.2f'%theta_estimate[1])
plt.text(0.5, 45, r'y= $\theta_0+ \theta_1 x_1$ = %0.2f + %0.2f $x_1$'%(theta_estimate[0],theta_estimate[1]))
plt.legend() #plot legend
Figure 2: Linear regression – training samples and prediction


Related topics

[1] Introduction to Signal Processing for Machine Learning
[2] Generating simulated dataset for regression problems - sklearn make_regression
[3] Hands-on: Basics of linear regression
