Linear Algebra Summary
P. Dewilde K. Diepold
October 21, 2022
Contents
1 Preliminaries
1.1 Vector Spaces
1.2 Bases
1.3 Matrices
1.4 Linear maps represented as matrices
1.5 Norms on vector spaces
1.6 Inner products
1.7 Definite matrices
1.8 Norms for linear maps
1.9 Linear maps on an inner product space
1.10 Unitary (Orthogonal) maps
1.11 Norms for matrices
1.12 Kernels and Ranges
1.13 Orthogonality
1.14 Projections
1.15 Eigenvalues and Eigenspaces
2 Systems of Equations, QR algorithm
2.1 Jacobi Transformations
2.2 Householder Reflection
2.3 QR Factorization
2.3.1 Elimination Scheme based on Jacobi Transformations
2.3.2 Elimination Scheme based on Householder Reflections
2.4 Solving the system Tx = b
2.5 Least squares solutions
2.6 Application: adaptive QR
2.7 Recursive (adaptive) computation
2.8 Reverse QR
2.9 Francis' QR algorithm to compute the Schur eigenvalue form
3 The singular value decomposition - SVD
3.1 Construction of the SVD
3.2 Singular Value Decomposition: proof
3.3 Properties of the SVD
3.4 SVD and noise: estimation of signal spaces
3.5 Angles between subspaces
3.6 Total Least Square - TLS
1 Preliminaries
In this section we review basic algebraic concepts and notation, in order to establish a common vocabulary and harmonize our ways of thinking. For more information on specifics, consult a basic textbook on linear algebra [1].
1.1 Vector Spaces
A vector space X over R or over C as 'base space' is a set of elements, called 'vectors', on which an 'addition' is defined with its normal properties (every vector has an additive inverse, and there is a neutral element called zero), and on which a 'multiplication with a scalar' (an element of the base space) is defined as well, with a slew of additional properties.
Concrete examples are common:

$$\text{in } \mathbb{R}^3:\ \begin{bmatrix} 1 \\ -3 \\ 5 \end{bmatrix}, \qquad \text{in } \mathbb{C}^3:\ \begin{bmatrix} 5+j \\ -3-6j \\ 2+2j \end{bmatrix} \quad (\mathbb{C}^3 \approx \mathbb{R}^6).$$

[Figure: the vector (1, -3, 5) drawn in a three-dimensional coordinate system with axes x, y, z.]
The addition of vectors belonging to the same $\mathbb{C}^n$ ($\mathbb{R}^n$) space is defined componentwise:

$$\begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} + \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix} = \begin{bmatrix} x_1 + y_1 \\ \vdots \\ x_n + y_n \end{bmatrix},$$

and the scalar multiplication, for $a \in \mathbb{R}$ or $a \in \mathbb{C}$:

$$a \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = \begin{bmatrix} a x_1 \\ \vdots \\ a x_n \end{bmatrix}.$$
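As a minimal numerical sketch of these two operations, here in Python with NumPy (the choice of language and the particular vectors are ours, not from the original text):

```python
import numpy as np

# Two vectors in R^3; vectors in C^3 work the same way with complex entries.
x = np.array([1.0, -3.0, 5.0])
y = np.array([2.0, 0.5, -1.0])

# Componentwise addition: (x + y)_k = x_k + y_k
print(x + y)    # [ 3.  -2.5  4. ]

# Scalar multiplication: (a x)_k = a x_k
a = 2.0
print(a * x)    # [ 2. -6. 10.]
```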
Example
The most interesting case for our purposes is the one where a vector is actually a discrete-time sequence {x(k) : k = 1, ..., N}. The space that surrounds us and in which electromagnetic waves propagate is mostly linear: signals reaching an antenna simply add up.
Composition Rules:
The following (logical) consistency rules must hold as well:

x + y = y + x (commutativity)
(x + y) + z = x + (y + z) (associativity)
x + 0 = x (neutral element)
x + (-x) = 0 (inverse)
a(x + y) = ax + ay, (a + b)x = ax + bx (distributivity of scalar multiplication w.r.t. addition)
a(bx) = (ab)x, 1 · x = x (consistencies)
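These rules can be checked numerically on concrete vectors; the following is a small sanity-check sketch in Python/NumPy (the vectors and scalars are arbitrary illustrative choices):

```python
import numpy as np

x = np.array([1.0, -3.0, 5.0])
y = np.array([2.0, 1.0, 0.0])
z = np.array([0.5, 0.5, 0.5])
a, b = 2.0, -1.5

assert np.allclose(x + y, y + x)                # commutativity
assert np.allclose((x + y) + z, x + (y + z))    # associativity
assert np.allclose(x + np.zeros(3), x)          # neutral element
assert np.allclose(x + (-x), np.zeros(3))       # inverse
assert np.allclose(a * (x + y), a * x + a * y)  # distributivity over vector addition
assert np.allclose((a + b) * x, a * x + b * x)  # distributivity over scalar addition
assert np.allclose(a * (b * x), (a * b) * x)    # consistency of scalar multiplication
assert np.allclose(1.0 * x, x)                  # unit scalar
```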
Vector space of functions
Let X be a set and Y a vector space, and consider the set of functions

X → Y.

We can define a new vector space on this set, derived from the vector space structure of Y:

(f1 + f2)(x) = f1(x) + f2(x)
(af)(x) = af(x).
Examples:

(1) [Figure: the graphs of two functions f1 and f2 and their pointwise sum f1 + f2.]

(2) Componentwise addition of row vectors:

$$[\,x_1\ x_2\ \cdots\ x_n\,] + [\,y_1\ y_2\ \cdots\ y_n\,] = [\,x_1 + y_1\ \ x_2 + y_2\ \cdots\ x_n + y_n\,]$$
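The pointwise definitions above translate directly into code; a brief sketch using Python closures (the particular functions f1 and f2 are illustrative choices):

```python
# Pointwise operations on functions X -> R, mirroring
# (f1 + f2)(x) = f1(x) + f2(x) and (a f)(x) = a f(x).

def add(f1, f2):
    return lambda x: f1(x) + f2(x)

def scale(a, f):
    return lambda x: a * f(x)

f1 = lambda x: x ** 2
f2 = lambda x: 3.0 * x

g = add(f1, f2)        # g(x) = x^2 + 3x
h = scale(2.0, f1)     # h(x) = 2 x^2
print(g(2.0), h(2.0))  # 10.0 8.0
```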
As already mentioned, most vectors we consider can indeed be interpreted either as continuous-time or discrete-time signals.
Linear maps
Assume now that both X and Y are vector spaces; then we can give a meaning to the notion of a 'linear map' as one that preserves the structure of a vector space:

$$f(x_1 + x_2) = f(x_1) + f(x_2), \qquad f(ax) = af(x).$$

We say that f defines a 'homomorphism of vector spaces'.
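On $\mathbb{R}^n$, multiplication by any fixed matrix gives such a linear map; a short sketch checking the two defining properties in Python/NumPy (the matrix A is an arbitrary example):

```python
import numpy as np

# A linear map f: R^2 -> R^3 given by f(x) = A x.
A = np.array([[1.0,  2.0],
              [0.0, -1.0],
              [3.0,  1.0]])

f = lambda x: A @ x

x1 = np.array([1.0, 2.0])
x2 = np.array([-1.0, 0.5])
a = 3.0

assert np.allclose(f(x1 + x2), f(x1) + f(x2))  # additivity
assert np.allclose(f(a * x1), a * f(x1))       # homogeneity
```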
1.2 Bases
We say that a set of vectors {e_k} in a vector space forms a basis if every vector of the space can be expressed as a unique linear combination of its elements. It turns out that a basis always exists, and that all the bases of a given vector space have exactly the same number of elements. In $\mathbb{R}^n$ or $\mathbb{C}^n$ the natural basis is given by the elements

$$e_k = [\,0\ \cdots\ 0\ \ 1\ \ 0\ \cdots\ 0\,]^T, \qquad k = 1, \ldots, n,$$

where the single 1 stands in the k-th position.
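A brief sketch of the natural basis and the unique expansion of a vector with respect to it, in Python/NumPy (np.eye yields the natural basis vectors as the columns of the identity matrix):

```python
import numpy as np

n = 3
E = np.eye(n)                  # columns e_1, ..., e_n of the natural basis
x = np.array([1.0, -3.0, 5.0])

# The expansion x = sum_k x_k e_k; in the natural basis the (unique)
# coefficients are simply the components of x.
reconstruction = sum(x[k] * E[:, k] for k in range(n))
assert np.allclose(reconstruction, x)
```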