Inner product spaces
An inner product on a vector space V is an operation on V that assigns to each pair of vectors ⃗x and ⃗y in
V a real number ⟨⃗x, ⃗y ⟩ satisfying the following conditions:
1. ⟨⃗x, ⃗x⟩ ≥ 0, with equality if and only if ⃗x = ⃗0
2. ⟨⃗x, ⃗y⟩ = ⟨⃗y, ⃗x⟩ for all ⃗x, ⃗y ∈ V
3. ⟨α⃗x + β⃗y, ⃗z⟩ = α⟨⃗x, ⃗z⟩ + β⟨⃗y, ⃗z⟩ for all ⃗x, ⃗y, ⃗z ∈ V and all scalars α, β
A vector space V with an inner product ⟨∗, ∗⟩ is called an inner product space.
Some standard inner products:
R^n: ⟨⃗x, ⃗y⟩ = Σ_{i=1}^{n} xi yi
R^{m×n}: ⟨A, B⟩ = Σ_{i=1}^{m} Σ_{j=1}^{n} aij bij
C[a, b]: ⟨f(x), g(x)⟩ = ∫_a^b f(x)g(x) dx
P_n: ⟨p(x), q(x)⟩ = Σ_{i=1}^{n} p(xi)q(xi), where x1, . . . , xn are distinct points
Length or norm of ⃗v: ||⃗v|| = √⟨⃗v, ⃗v⟩
Two vectors are orthogonal if their inner product is equal to 0.
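As a quick numerical check of the formulas above, here is a minimal numpy sketch (the vectors and matrices are arbitrary examples chosen for illustration):

import numpy as np

# standard inner product on R^n: sum of componentwise products
x = np.array([1.0, 2.0, 2.0])
y = np.array([2.0, 1.0, -2.0])
print(np.dot(x, y))            # 1*2 + 2*1 + 2*(-2) = 0, so x and y are orthogonal

# inner product on R^{m×n}: sum of entrywise products
A = np.array([[1.0, 0.0], [2.0, 1.0]])
B = np.array([[3.0, 1.0], [0.0, 4.0]])
print(np.sum(A * B))           # 1*3 + 0*1 + 2*0 + 1*4 = 7

# norm induced by the inner product: ||v|| = sqrt(<v, v>)
print(np.sqrt(np.dot(x, x)))   # sqrt(1 + 4 + 4) = 3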
If ⃗u, ⃗v are vectors in an inner product space V and ⃗v ≠ ⃗0, then the scalar projection α and the vector projection p⃗ of ⃗u onto ⃗v are given by
α = ⟨⃗u, ⃗v⟩ / ||⃗v|| and p⃗ = (⟨⃗u, ⃗v⟩ / ⟨⃗v, ⃗v⟩) ⃗v
If ⃗v ≠ ⃗0 and p⃗ is as above, then ⃗u − p⃗ and p⃗ are orthogonal, and ⃗u = p⃗ ⇐⇒ ⃗u = β⃗v for some scalar β.
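These projection formulas can be sketched in numpy as follows (⃗u and ⃗v below are arbitrary illustrative vectors, using the standard inner product on R^3):

import numpy as np

u = np.array([3.0, 1.0, 2.0])
v = np.array([1.0, 0.0, 1.0])                # any nonzero vector

alpha = np.dot(u, v) / np.linalg.norm(v)     # scalar projection of u onto v
p = (np.dot(u, v) / np.dot(v, v)) * v        # vector projection of u onto v

print(alpha)                                 # 5 / sqrt(2)
print(p)                                     # [2.5, 0.0, 2.5]
print(np.dot(u - p, p))                      # 0 (up to rounding): u - p is orthogonal to p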
Cauchy-Schwarz inequality: |⟨⃗u, ⃗v⟩| ≤ ||⃗u|| ||⃗v||, with equality holding if and only if ⃗u and ⃗v are linearly dependent.
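A quick numerical illustration of the inequality (a minimal numpy sketch; ⃗u and ⃗v are arbitrary):

import numpy as np

u = np.array([1.0, 2.0, -1.0])
v = np.array([3.0, 0.0, 4.0])
# |<u, v>| is bounded by ||u|| ||v||
print(abs(np.dot(u, v)), np.linalg.norm(u) * np.linalg.norm(v))          # 1.0 vs about 12.25
# equality when the vectors are linearly dependent, e.g. for u and 2u
print(abs(np.dot(u, 2 * u)), np.linalg.norm(u) * np.linalg.norm(2 * u))  # both 12.0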
A vector space V is said to be a normed linear space if to each vector ⃗v ∈ V there is associated a real number
||⃗v || satisfying:
1. ||⃗v|| ≥ 0 with ||⃗v|| = 0 ⇐⇒ ⃗v = ⃗0
2. ||α⃗v|| = |α| ||⃗v||
3. ||⃗v + w⃗|| ≤ ||⃗v|| + ||w⃗||
Let ⃗x and ⃗y be vectors in a normed linear space. The distance between them is defined to be the number
||⃗y − ⃗x||
Orthonormal sets
If v⃗1, . . . , v⃗n are nonzero vectors and ⟨v⃗i, v⃗j⟩ = 0 whenever i ≠ j, then v⃗1, . . . , v⃗n is said to be an orthogonal set of vectors. The vectors v⃗1, . . . , v⃗n are then linearly independent.
An orthonormal set of vectors is an orthogonal set of unit vectors. You can make a unit vector out of any nonzero vector by dividing the vector by its norm.
Let {u⃗1, . . . , u⃗n} be an orthonormal basis for an inner product space V. If ⃗v = Σ_{i=1}^{n} ci u⃗i, then ci = ⟨⃗v, u⃗i⟩.
Let {u⃗1, . . . , u⃗n} be an orthonormal basis of V. Let ⃗u = Σ_{i=1}^{n} ai u⃗i and ⃗v = Σ_{i=1}^{n} bi u⃗i be in V. Then ⟨⃗u, ⃗v⟩ = Σ_{i=1}^{n} ai bi.
Parseval's formula: Let {u⃗1, . . . , u⃗n} be an orthonormal basis of V, let || ∗ || = √⟨∗, ∗⟩, and consider ⃗v = Σ_{i=1}^{n} ci u⃗i. Then its norm equals ||⃗v|| = (Σ_{i=1}^{n} ci²)^{1/2}.
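A small numpy check of the coefficient formula and of Parseval's formula, using a hand-picked orthonormal basis of R^3 and an arbitrary vector ⃗v:

import numpy as np

# the columns of U form an orthonormal basis of R^3
u1 = np.array([1.0, 2.0, 2.0]) / 3.0
u2 = np.array([2.0, 1.0, -2.0]) / 3.0
u3 = np.array([2.0, -2.0, 1.0]) / 3.0
U = np.column_stack([u1, u2, u3])

v = np.array([1.0, -1.0, 4.0])
c = U.T @ v                                      # ci = <v, u_i>

print(np.allclose(U @ c, v))                     # True: v = sum_i ci u_i
print(np.linalg.norm(v), np.sqrt(np.sum(c**2)))  # equal, by Parseval's formula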
Q is an n × n orthogonal matrix ⇐⇒ the column vectors of Q form an orthonormal basis for R^n ⇐⇒ Q^T Q = I ⇐⇒ Q^T = Q^{-1} ⇐⇒ ⟨Q⃗x, Q⃗y⟩ = ⟨⃗x, ⃗y⟩ for all ⃗x, ⃗y ⇐⇒ ||Q⃗x|| = ||⃗x|| for all ⃗x
A permutation matrix is a matrix obtained by re-ordering the columns of the identity matrix.
If the column vectors of A form an orthonormal set of vectors in R^m, then A^T A = I and the solution of the least squares problem A⃗x = ⃗b is x̂ = A^T ⃗b.
Let S be a subspace of an inner product space V and let ⃗x ∈ V. Let {u⃗1, . . . , u⃗n} be an orthonormal basis for S. If p⃗ = Σ_{i=1}^{n} ci u⃗i where ci = ⟨⃗x, u⃗i⟩ for each i, then p⃗ − ⃗x ∈ S^⊥.
Let S be a nonzero subspace of R^m and let ⃗b ∈ R^m. If {u⃗1, . . . , u⃗k} is an orthonormal basis for S and U = [u⃗1, . . . , u⃗k], then the projection p⃗ of ⃗b onto S is given by p⃗ = U U^T ⃗b.
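The last two statements (least squares with orthonormal columns, and the projection p⃗ = U U^T ⃗b) can be verified with a short numpy sketch; U and ⃗b below are illustrative choices:

import numpy as np

# two orthonormal columns spanning a plane S in R^3
u1 = np.array([1.0, 2.0, 2.0]) / 3.0
u2 = np.array([2.0, 1.0, -2.0]) / 3.0
U = np.column_stack([u1, u2])

b = np.array([3.0, 0.0, 3.0])

x_hat = U.T @ b                # least squares solution when the columns are orthonormal
p = U @ U.T @ b                # projection of b onto S

print(np.allclose(x_hat, np.linalg.lstsq(U, b, rcond=None)[0]))  # True
print(np.allclose(U.T @ (b - p), 0))                             # True: b - p is orthogonal to S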
Gram-Schmidt orthogonalization
The Gram-Schmidt process lets us take any basis {x⃗1, . . . , x⃗n} of an inner product space and turn it into an orthonormal basis {u⃗1, . . . , u⃗n} in the following manner:
1. u⃗1 = (1 / ||x⃗1||) x⃗1
2. p⃗1 = ⟨x⃗2, u⃗1⟩u⃗1
3. u⃗2 = (1 / ||x⃗2 − p⃗1||) (x⃗2 − p⃗1)
4. p⃗2 = ⟨x⃗3, u⃗1⟩u⃗1 + ⟨x⃗3, u⃗2⟩u⃗2
5. u⃗3 = (1 / ||x⃗3 − p⃗2||) (x⃗3 − p⃗2)
6. et cetera
In general: Let {x⃗1, . . . , x⃗n} be a basis of an inner product space. Define u⃗1 = (1 / ||x⃗1||) x⃗1 and u⃗_{k+1} = (1 / ||x⃗_{k+1} − p⃗k||) (x⃗_{k+1} − p⃗k), where p⃗k = Σ_{i=1}^{k} ⟨x⃗_{k+1}, u⃗i⟩ u⃗i.
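A minimal Python/numpy implementation of this recursion (classical Gram-Schmidt with the standard inner product on R^n; the function name and the example basis are only illustrative):

import numpy as np

def gram_schmidt(X):
    # X: list of linearly independent 1-D numpy arrays x_1, ..., x_n
    # returns the orthonormal vectors u_1, ..., u_n
    U = []
    for x in X:
        p = sum(np.dot(x, u) * u for u in U)   # p_k = sum_i <x_{k+1}, u_i> u_i
        w = x - p                              # component of x orthogonal to the previous u_i
        U.append(w / np.linalg.norm(w))        # normalize
    return U

basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 0.0, 1.0]),
         np.array([0.0, 1.0, 1.0])]
for u in gram_schmidt(basis):
    print(u, np.linalg.norm(u))                # each u has norm 1; pairwise inner products are ~0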
Application of GS: QR factorization
Let A ∈ R^{n×n} be a nonsingular matrix. Then there exists an orthogonal matrix Q ∈ R^{n×n} and an upper triangular matrix R ∈ R^{n×n} with positive diagonal entries such that A = QR.
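For concrete computations, numpy's built-in QR routine can be used as a quick check (note that it does not guarantee positive diagonal entries in R; flipping the signs of the offending rows of R and the corresponding columns of Q restores that convention):

import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])          # an arbitrary nonsingular matrix

Q, R = np.linalg.qr(A)

print(np.allclose(Q.T @ Q, np.eye(3)))   # True: Q is orthogonal
print(np.allclose(Q @ R, A))             # True: A = QR
print(np.allclose(np.tril(R, -1), 0))    # True: R is upper triangular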