Linear Models in Statistics
Summary
EBB072A05
Semester I B
Wouter Voskuilen
S4916344
Slides by dr. M. Kesina
Contents
1 Chapter 1: Matrix Algebra
  1.1 Vectors and Matrices
  1.2 Rank, Determinant, and Inverse
  1.3 Eigenvalues
  1.4 Positive (semi) Definite Matrices
  1.5 Projection Matrices
  1.6 Partitioned Matrices
  1.7 Differentiation Rules
2 Chapter 2: Random Vectors
  2.1 Definition
  2.2 Operations and Transformations
  2.3 Expected Value
  2.4 Variance-covariance matrix
  2.5 Correlation matrix
  2.6 General Properties
3 Chapter 3: Multivariate Normal and Related Distributions
  3.1 Univariate Normal Distribution
  3.2 Multivariate Normal Distribution
  3.3 Related Distributions: χ², F, and t distribution
  3.4 Results on Quadratic Forms
4 Chapter 4: Linear Model
  4.1 Introduction
  4.2 OLS Estimation
  4.3 Goodness of Fit
  4.4 Properties of the OLS
  4.5 Hypothesis Testing
  4.6 Maximum Likelihood Estimation
  4.7 Frisch-Waugh-Lovell Theorem
  4.8 Wrong Specification of the Regressor Matrix
1 Chapter 1: Matrix Algebra
1.1 Vectors and Matrices
Let a and b be two vectors of the same order n.
Summation of two vectors:
$$a + b = \begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix} = \begin{pmatrix} a_1 + b_1 \\ a_2 + b_2 \\ \vdots \\ a_n + b_n \end{pmatrix}$$
Vector summation is:
− Commutative: a + b = b + a
− Associative: (a + b) + c = a + (b + c), where c is a vector of the same order as a and b.
Multiplication of a vector with a scalar λ:
$$\lambda a = \lambda \begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix}$$
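A minimal NumPy sketch of these two operations; the vectors and the scalar below are arbitrary example values, not taken from the slides:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])   # vector a of order n = 3
b = np.array([4.0, 5.0, 6.0])   # vector b of the same order
lam = 2.0                       # scalar λ

print(a + b)      # elementwise sum: [5. 7. 9.]
print(b + a)      # commutativity: same result as a + b
print(lam * a)    # scalar multiplication: [2. 4. 6.]
```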
Inner or scalar product of two vectors a and b of the same order n:
$$\langle a, b\rangle = a'b = \sum_{i=1}^{n} a_i b_i.$$
The length (or norm) of a vector:
$$\|a\| = \langle a, a\rangle^{1/2} = \sqrt{a'a}.$$
Any nonzero vector can be normalized by
$$a^{o} = \frac{1}{\|a\|}\, a.$$
A normalized vector has norm 1.
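These definitions are easy to check numerically; the sketch below uses an assumed example vector:

```python
import numpy as np

a = np.array([3.0, 4.0])

inner = a @ a               # ⟨a, a⟩ = a'a = 25
norm = np.sqrt(inner)       # ∥a∥ = 5, same as np.linalg.norm(a)
a_normalized = a / norm     # a° = a / ∥a∥

print(norm)                           # 5.0
print(np.linalg.norm(a_normalized))   # 1.0: a normalized vector has norm 1
```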
Collinearity of two vectors a and b:
$$a = \lambda b$$
for some scalar λ.
Two vectors a and b with ⟨a, b⟩ = 0 are called orthogonal.
If ⟨a, b⟩ = 0 and ∥a∥ = ∥b∥ = 1, then a and b are called orthonormal.
Outer product of two vectors a and b of the same order:
$$ab' = \begin{pmatrix} a_1 b_1 & \cdots & a_1 b_n \\ \vdots & \ddots & \vdots \\ a_n b_1 & \cdots & a_n b_n \end{pmatrix}$$
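A short illustration of orthogonality, orthonormality, and the outer product; the example vectors are assumptions chosen for simplicity:

```python
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])

print(a @ b)                                  # 0.0: a and b are orthogonal
print(np.linalg.norm(a), np.linalg.norm(b))   # both 1.0, hence orthonormal

# Outer product ab' is an n × n matrix with (i, j) entry a_i * b_j
print(np.outer(a, b))                         # [[0. 1.], [0. 0.]]
```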
Let A be an n × m matrix.
− A is square if n = m.
− A is symmetric if n = m and aij = aji , i, j = 1, ..., n.
− A is diagonal if n = m and aij = 0 if i ̸= j, i, j = 1, ..., n.
Result 1
A is symmetric ⇐⇒ A = A′ .
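Result 1 can be verified numerically for a given matrix, e.g. (example matrix assumed):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

print(np.array_equal(A, A.T))   # True: A = A', so A is symmetric
```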
Multiplication of a matrix A with a scalar λ
λA = {λaij }
Multiplication of an n × m matrix A with an m × 1 vector x:
$$\{Ax\}_i = \sum_{\ell=1}^{m} a_{i\ell}\, x_\ell$$
Multiplication of an n × m matrix A and an m × k matrix B:
$$\{AB\}_{ij} = \sum_{\ell=1}^{m} a_{i\ell}\, b_{\ell j}$$
The resulting matrix AB is of dimension n × k.
Note: the number of columns of the first matrix must equal the number of rows of the second matrix.
A and B are conformable if their orders are such that AB is defined.
Result 2
For conformable matrices, (ABC)′ = C ′ B ′ A′ .
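A short numerical check of the multiplication rule, the resulting dimensions, and Result 2; the sketch uses arbitrary conformable matrices generated for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))   # n × m
B = rng.standard_normal((3, 4))   # m × k
C = rng.standard_normal((4, 5))

AB = A @ B
print(AB.shape)                   # (2, 4): AB is n × k

# Result 2: (ABC)' = C'B'A'
lhs = (A @ B @ C).T
rhs = C.T @ B.T @ A.T
print(np.allclose(lhs, rhs))      # True
```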
In the product AB: