# Assumptions of the classical regression model

Updated: Feb 27

As part of a series of articles on Predictive Data Analytics, the Agilytics team will be publishing some of the fundamental concepts. The classical normal linear regression model can be used to handle the twin problems of statistical inference, i.e.

Estimation

Hypothesis testing

The classical regression model rests on several simplifying assumptions. These 10 assumptions are as follows:
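The two problems can be illustrated together. Below is a minimal sketch (assuming NumPy and simulated data with a made-up true slope of 1.5): the estimation step recovers the coefficients by least squares, and the hypothesis-testing step computes a t-statistic for the slope against H0: beta = 0.

```python
import numpy as np

# Hypothetical example: simulate y = 2 + 1.5*x + u, then estimate and test.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 + 1.5 * x + rng.normal(scale=1.0, size=x.size)  # true slope is 1.5

X = np.column_stack([np.ones_like(x), x])          # design matrix with intercept
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # estimation step

resid = y - X @ beta_hat
sigma2 = resid @ resid / (x.size - 2)              # error-variance estimate
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
t_stat = beta_hat[1] / se                          # hypothesis-testing step

print(beta_hat[1], t_stat)                         # slope near 1.5, large t
```

With the simulated data, the estimated slope lands close to the true 1.5 and the t-statistic is far into the rejection region, so H0 is rejected.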

**Assumptions**

**[1]:** The regression model is linear in the parameters.

**[2]:** The values of the regressors, the X's, are fixed, or the X values are independent of the error term; that is, there is zero covariance between the disturbance u and each X variable.

**[3]:** For given X's, the mean value of the disturbance u_i is zero.

**[4]:** For given X's, the variance of u_i is constant, or homoscedastic.
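Constant error variance can be probed with a Goldfeld-Quandt-style comparison. The sketch below (an assumed setup with simulated, genuinely homoscedastic errors) sorts the sample by X and compares the residual variance in the low-X and high-X halves; under homoscedasticity the ratio should sit near 1.

```python
import numpy as np

# Simulate data whose errors have constant variance, then compare
# residual variance across the low-X and high-X halves of the sample.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 200))
y = 1.0 + 0.5 * x + rng.normal(scale=2.0, size=x.size)  # constant-variance errors

X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

half = x.size // 2
ratio = resid[half:].var() / resid[:half].var()  # near 1 if homoscedastic
print(round(ratio, 2))
```

A ratio far from 1 (in a formal test, judged against an F distribution) would suggest the error variance changes with X, i.e. heteroscedasticity.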

**[5]:** For given X's, there is no autocorrelation, or serial correlation, between the disturbances.

**[6]:** The number of observations n must be greater than the number of parameters to be estimated.

**[7]:** There must be sufficient variation in the values of the X variables.

**[8]:** There is no exact collinearity between X variables.
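Exact collinearity means one regressor is an exact linear combination of others, which makes X'X singular and leaves the OLS normal equations without a unique solution. A minimal sketch (with made-up regressors) shows how the column rank of the design matrix reveals it:

```python
import numpy as np

n = 50
x1 = np.arange(n, dtype=float)
x2 = 3.0 * x1  # exact linear combination of x1: perfect collinearity

X_bad = np.column_stack([np.ones(n), x1, x2])
X_ok = np.column_stack([np.ones(n), x1, x1 ** 2])  # nonlinear, not collinear

print(np.linalg.matrix_rank(X_bad))  # 2, fewer than its 3 columns
print(np.linalg.matrix_rank(X_ok))   # 3, full column rank
```

Note that X_ok is fine under this assumption even though x1 squared is a function of x1: the assumption rules out exact *linear* relationships only.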

**[9]:** The model is correctly specified, so there is no specification bias.

**[10]:** The stochastic (disturbance) term u_i is normally distributed.
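Normality of the disturbances is commonly checked with the Jarque-Bera statistic, which combines sample skewness and excess kurtosis; it is asymptotically chi-squared with 2 degrees of freedom, so values above roughly 6 cast doubt on normality. Below is a hand-rolled sketch (the two simulated error series are assumed examples):

```python
import numpy as np

def jarque_bera(u):
    # JB = n/6 * (skewness^2 + (excess kurtosis)^2 / 4).
    u = u - u.mean()
    s = u.std()
    skew = np.mean(u ** 3) / s ** 3
    kurt = np.mean(u ** 4) / s ** 4
    return u.size / 6 * (skew ** 2 + (kurt - 3) ** 2 / 4)

rng = np.random.default_rng(3)
jb_norm = jarque_bera(rng.normal(size=2000))       # small for normal errors
jb_exp = jarque_bera(rng.exponential(size=2000))   # large for skewed errors
print(jb_norm, jb_exp)
```

The exponential draws are heavily skewed, so their statistic is orders of magnitude above the chi-squared cutoff, while the normal draws stay below it.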

When these assumptions hold, the Ordinary Least Squares (OLS) estimators are BLUE (Best Linear Unbiased Estimators). Strictly, the Gauss-Markov theorem that establishes the BLUE property does not require the normality assumption [10]; normality is what justifies the usual t- and F-tests in finite samples.
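The unbiasedness part of the BLUE property can be seen in a small Monte Carlo sketch (an assumed simulation with a made-up true slope of 1.5): averaged over many samples drawn under the assumptions above, the OLS slope estimates centre on the true value.

```python
import numpy as np

# Monte Carlo: repeatedly draw samples and average the OLS slope estimates.
rng = np.random.default_rng(4)
true_beta = 1.5
x = np.linspace(0, 10, 30)
X = np.column_stack([np.ones_like(x), x])

slopes = []
for _ in range(2000):
    y = 2.0 + true_beta * x + rng.normal(size=x.size)
    slopes.append(np.linalg.lstsq(X, y, rcond=None)[0][1])

mean_slope = np.mean(slopes)
print(round(mean_slope, 2))  # close to the true 1.5
```

Any single sample's estimate wanders around 1.5, but the average across samples sits essentially on top of it, which is exactly what unbiasedness claims.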