Multivariate Linear Regression

Introduction to Multivariate Linear Regression

In this kind of regression, we have multiple features predicting a single outcome; in other words, a single dependent variable is explained by multiple independent variables.
In this regression we will use the Gauss-Markov setup, which has the following assumptions:

  • The model is linear in the parameters β
  • The design matrix X has full column rank, so XTX is invertible
  • Errors follow the normal distribution with mean 0 and variance σ2I, hence: ε~N(0,σ2I)

Hence we can say: for a model containing p regressors and n observations, the model in matrix form is

Y = Xβ + ε

Where:

  • Y is the matrix of responses, of order n×1
  • X is the design matrix, of order n×(p+1) (a column of ones for the intercept followed by the p regressor columns)
  • β is the matrix of coefficients, of order (p+1)×1
  • ε is the matrix of errors, of order n×1
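As a small illustration, the design matrix can be assembled with NumPy by prepending a column of ones for the intercept (the data below is a made-up toy example, not from the original):

```python
import numpy as np

# Hypothetical toy data: n = 5 observations, two regressors x1 and x2
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])

# Design matrix X: a column of ones (intercept) followed by the regressors
X = np.column_stack([np.ones_like(x1), x1, x2])
print(X.shape)  # (5, 3)
```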


Now the error can be given by:

ε = Y − Xβ

Since, in matrix notation, the sum of squares of all the elements of a vector a is given by ∑a2 = aTa, the summation of the squared errors can be given by
∑ε2 = εTε
Now substituting

ε = Y − Xβ

in ∑ε2 = εTε:

εTε = (Y − Xβ)T(Y − Xβ) = YTY − YTXβ − βTXTY + βTXTXβ

Since YTXβ and βTXTY are scalars, and each is the transpose of the other, they are equal, so we can write

εTε = YTY − 2βTXTY + βTXTXβ

For the ordinary least squares approach, ∑ε2 should be minimum, hence (∂∑ε2)/(∂β) = 0:

∂(εTε)/∂β = −2XTY + 2XTXβ = 0
XTXβ̂ = XTY
β̂ = (XTX)-1XTY
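The closed-form solution β̂ = (XTX)-1XTY can be checked numerically; the sketch below generates synthetic data from assumed true coefficients and recovers them via the normal equations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: n = 100 observations, 2 regressors plus an intercept
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta_true = np.array([2.0, -1.0, 0.5])   # assumed true coefficients
Y = X @ beta_true + rng.normal(scale=0.1, size=n)

# OLS estimator from the derivation: solve (X^T X) beta_hat = X^T Y
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
print(beta_hat)  # close to beta_true
```

Solving the linear system directly is numerically preferable to explicitly inverting XTX, but both compute the same estimator.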

Proof that the estimator we have found is BLUE (Best Linear Unbiased Estimator)

A plot of the residuals shows both positive and negative errors scattered around zero, and a histogram of these errors resembles a bell-shaped curve.

Since the errors are centered around zero and their distribution looks like a normal distribution, we can safely assume
ε~N(0,σ2I)
Where I is an identity matrix of order n×n.
The residuals are uncorrelated since all the off-diagonal elements of the covariance matrix σ2I are 0, which means the covariances between them are 0.

This shows that the estimator β̂ = (XTX)-1XTY is BLUE:

  • Linear: β̂ = (XTX)-1XTY is a linear combination of Y
  • Unbiased: substituting Y = Xβ + ε in β̂ = (XTX)-1XTY gives
    β̂ = (XTX)-1XTXβ + (XTX)-1XTε
    Since (XTX)-1XTX = I, the equation becomes
    β̂ = β + (XTX)-1XTε
    Since E(ε) = 0, we get E(β̂) = β, so the estimator is unbiased
  • Best: let β~ = ((XTX)-1XT + D)Y be any other linear estimator of β, where D is a non-zero matrix
    Substituting Y = Xβ + ε gives
    β~ = (XTX)-1XTXβ + DXβ + ((XTX)-1XT + D)ε
    Since (XTX)-1XTX = I, the equation becomes
    β~ = (I + DX)β + ((XTX)-1XT + D)ε
    For β~ to be an unbiased estimator, E(β~) = β, which requires DX = 0
    Now
    Var(β~) = σ2((XTX)-1XT + D)((XTX)-1XT + D)T = σ2(XTX)-1 + σ2DDT = Var(β̂) + σ2DDT
    where the cross terms vanish because DX = 0
  • Since DDT is a non-negative definite matrix, Var(β~) ≥ Var(β̂)
  • Since the variance of any other linear unbiased estimator is at least the variance of our estimator, we can conclude that our estimator is the Best Linear Unbiased Estimator
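As a quick illustration of the unbiasedness step (using a hypothetical fixed design and assumed true β), averaging β̂ over many independent error draws recovers β:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical fixed design (intercept + one regressor) and assumed true beta
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta = np.array([1.0, 2.0])

# Average beta_hat over many simulated error draws: E(beta_hat) should equal beta
estimates = []
for _ in range(2000):
    eps = rng.normal(scale=1.0, size=n)                 # eps ~ N(0, sigma^2 I)
    Y = X @ beta + eps
    estimates.append(np.linalg.solve(X.T @ X, X.T @ Y))

print(np.mean(estimates, axis=0))  # close to [1.0, 2.0]
```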

R squared (R2)

The R-squared (coefficient of determination) measures the proportion of the variance in the dependent variable that is explained by the model:

R2 = 1 − ∑(y − y_pred)2 / ∑(y − ȳ)2

Where:
y_pred are our predictions using the regression model
ȳ is the mean of the observed values of y
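As a sketch with made-up observed and predicted values, R2 can be computed directly from this definition:

```python
import numpy as np

# Made-up observed values and model predictions
y = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
y_pred = np.array([2.8, 5.1, 7.2, 8.9, 11.0])

ss_res = np.sum((y - y_pred) ** 2)       # residual sum of squares
ss_tot = np.sum((y - np.mean(y)) ** 2)   # total sum of squares
r2 = 1 - ss_res / ss_tot
print(r2)  # ≈ 0.9975
```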

Adjusted R squared (Adjusted R2)

The adjusted R-squared is a modified version of R-squared that accounts for the number of predictors in the model. The adjusted R-squared increases only if a new term improves the model more than would be expected by chance, and it decreases when a predictor improves the model by less than expected by chance.
The formula for adjusted R2 is given by:

Adjusted R2 = 1 − (1 − R2)(n − 1)/(n − p − 1)

Where n is the number of observations and p is the number of regressors in the model.
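A minimal helper for this formula, evaluated on hypothetical values:

```python
# Adjusted R^2 from R^2, n observations, and p regressors
def adjusted_r2(r2, n, p):
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Hypothetical example: R^2 = 0.9975 with n = 5 observations, p = 2 regressors
print(adjusted_r2(0.9975, n=5, p=2))  # ≈ 0.995
```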
Python code for Multivariate Linear Regression
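A minimal NumPy-only sketch tying the pieces together — synthetic data with assumed coefficients and noise level, fit by least squares, then scored with R2 and adjusted R2:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic dataset: n = 200 observations, p = 3 regressors (all hypothetical)
n, p = 200, 3
X_raw = rng.normal(size=(n, p))
beta_true = np.array([4.0, 1.5, -2.0, 0.7])           # intercept first
y = beta_true[0] + X_raw @ beta_true[1:] + rng.normal(scale=0.5, size=n)

# Design matrix with intercept column; fit by ordinary least squares
X = np.column_stack([np.ones(n), X_raw])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Goodness of fit: R^2 and adjusted R^2
y_pred = X @ beta_hat
ss_res = np.sum((y - y_pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)

print(beta_hat)        # close to beta_true
print(r2, adj_r2)
```

`np.linalg.lstsq` solves the least-squares problem without forming (XTX)-1 explicitly, which is more numerically stable for ill-conditioned designs.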