MAE 229: Linear Algebra Notes
Dr. Simons, Fall 2020
Contents
 1  Review of Vectors, Lines and Planes
 2  The Integers mod n, Z_n, and Fields
 3  Solving a linear system over Z_p
 4  Maple Solution of a linear system over Z_p
 5  Different Views of a Linear System
 6  Linear Combinations and Span
 7  Linear Dependence and Independence
 8  Subspaces
 9  Bases of Subspaces
10  Dimension and Rank
11  Coordinates
12  Linear Transformations
13  Linear Transformations, continued
14  Adjacency Matrix of a Graph
15  Markov Chains
16  Eigenvalues and Eigenvectors
17  Similarity and Diagonalization
18  Some Applications of Diagonalization
19  Regular Markov Chains
20  The Power Method
21  Orthogonal Vectors and Matrices
22  Orthogonal Complements and Projections
23  Symmetric Matrices and Quadratic Forms
24  Least Squares Approximation
25  The Singular Value Decomposition
26  Matrix Factorizations and the Pseudoinverse
27  Vector Spaces
Review: Vectors, Lines and Planes
1 Vectors in R^n
v is a vector in R^n means v = [v_1, v_2, . . . , v_n], with v_1, v_2, . . . , v_n ∈ R, n ∈ N and n ≥ 1. Poole
uses the row vector [2, 1] to denote a vector and (2, 1) to denote a point, to emphasize the
difference between vectors and points. For either row or column vectors we will use square
brackets (i.e., [ and ]) to enclose the components of the vector; for example, the same vector
written as a column vector is
    [ 2 ]
    [ 1 ]
Column vectors are typically used in conjunction with matrices.
[Figure 1: Coordinates of a vector in R^2. The vector v = [2, 1] is shown drawn from the origin O(0, 0) to the point P(2, 1), and drawn again from the point (a, b) to (a + 2, b + 1).]
1.1 Operations
Let v = [v_1, v_2, . . . , v_n] and w = [w_1, w_2, . . . , w_n] be vectors in R^n.
Vector Addition: v + w = [v_1, v_2, . . . , v_n] + [w_1, w_2, . . . , w_n] = [v_1 + w_1, v_2 + w_2, . . . , v_n + w_n]
Scalar Multiplication: For a scalar k, kv = k[v_1, v_2, . . . , v_n] = [kv_1, kv_2, . . . , kv_n]
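These componentwise operations are easy to check numerically. The following is a minimal sketch in Python with NumPy (an illustration, not part of the notes; the particular vectors and scalar are arbitrary examples):

    import numpy as np

    # Arbitrary example vectors in R^3 (chosen only for illustration)
    v = np.array([1.0, 2.0, 3.0])
    w = np.array([4.0, -1.0, 0.5])

    # Vector addition is componentwise: [1+4, 2+(-1), 3+0.5] = [5.0, 1.0, 3.5]
    print(v + w)

    # Scalar multiplication scales every component: [2.0, 4.0, 6.0]
    k = 2.0
    print(k * v)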
1.2 Properties of Vectors
Let u, v and w be vectors in R^n, and k and m be scalars (real numbers). Then
1. v + w = w + v Commutativity
2. (u + v) + w = u + (v + w) Associativity
3. v + 0 = v 0 is the zero vector
4. v + (−v) = 0 Negative vector
5. k(v + w) = kv + kw Distributivity
6. (k + m)v = kv + mv Distributivity
7. k(mv) = (km)v
8. 1v = v
Many other mathematical objects (matrices, functions, polynomials, etc.) have their own
operations of addition and scalar multiplication that satisfy the same properties, so many
properties of vectors in R^n carry over to these other objects.
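For instance, the distributive properties can be verified numerically for vectors and, just as well, for matrices under their usual addition and scalar multiplication. The sketch below uses Python with NumPy; the particular vectors, matrices and scalars are arbitrary illustrations, not taken from the notes:

    import numpy as np

    # Arbitrary example scalars and vectors
    k, m = 2.0, -3.0
    v = np.array([1.0, 4.0, -2.0])
    w = np.array([0.5, 1.0, 3.0])

    # Property 5: k(v + w) = kv + kw
    print(np.allclose(k * (v + w), k * v + k * w))   # True

    # Property 6: (k + m)v = kv + mv
    print(np.allclose((k + m) * v, k * v + m * v))   # True

    # The same property holds for matrices with matrix addition
    A = np.array([[1.0, 2.0], [3.0, 4.0]])
    B = np.array([[0.0, -1.0], [5.0, 2.0]])
    print(np.allclose(k * (A + B), k * A + k * B))   # True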
1.3 Geometry in R^n
Dot Product: v · w = [v_1, v_2, . . . , v_n] · [w_1, w_2, . . . , w_n] = v_1 w_1 + v_2 w_2 + · · · + v_n w_n (a scalar)
Norm (or Length): ‖v‖ = √(v · v) = √(v_1^2 + v_2^2 + · · · + v_n^2), and so ‖v‖^2 = v · v.
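As a quick illustration (a Python/NumPy sketch with arbitrary example vectors, not part of the notes):

    import numpy as np

    # Arbitrary example vectors in R^2
    v = np.array([3.0, 4.0])
    w = np.array([1.0, 2.0])

    # Dot product: sum of componentwise products, 3*1 + 4*2 = 11.0
    print(np.dot(v, w))

    # Norm: square root of v . v, sqrt(9 + 16) = 5.0
    print(np.sqrt(np.dot(v, v)))
    print(np.linalg.norm(v))     # same value, 5.0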
Angle: The angle θ between vectors v and w is chosen so that 0 ≤ θ ≤ π. Then
v · w = ‖v‖‖w‖ cos θ (very useful), so if v, w ≠ 0 then cos θ = (v · w) / (‖v‖‖w‖).
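A similar sketch for the angle formula (again Python/NumPy, with arbitrary example vectors):

    import numpy as np

    # Arbitrary example vectors in R^2
    v = np.array([1.0, 0.0])
    w = np.array([1.0, 1.0])

    # cos(theta) = (v . w) / (||v|| ||w||)
    cos_theta = np.dot(v, w) / (np.linalg.norm(v) * np.linalg.norm(w))
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))  # clip guards against tiny rounding errors

    print(theta)              # approximately pi/4 = 0.785...
    print(np.degrees(theta))  # approximately 45 degrees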
Orthogonal Vectors: Vectors v and w are orthogonal if the angle between them is π/2 (a
right angle). If v ≠ 0 and w ≠ 0, then v and w are orthogonal iff v · w = 0.
Cauchy-Schwarz inequality: |v · w| ≤ ‖v‖‖w‖ (this implies that | cos θ| ≤ 1)
Triangle inequality: ‖v + w‖ ≤ ‖v‖ + ‖w‖
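These facts can also be checked numerically; the sketch below (Python/NumPy, arbitrary example vectors chosen so that v · w = 0) verifies orthogonality and both inequalities:

    import numpy as np

    # Arbitrary example vectors in R^3, chosen so their dot product is zero
    v = np.array([1.0, 2.0, -1.0])
    w = np.array([2.0, 0.0, 2.0])

    # Orthogonality: nonzero vectors are orthogonal iff v . w = 0
    print(np.dot(v, w) == 0.0)                                              # True: 1*2 + 2*0 + (-1)*2 = 0

    # Cauchy-Schwarz: |v . w| <= ||v|| ||w||
    print(abs(np.dot(v, w)) <= np.linalg.norm(v) * np.linalg.norm(w))       # True

    # Triangle inequality: ||v + w|| <= ||v|| + ||w||
    print(np.linalg.norm(v + w) <= np.linalg.norm(v) + np.linalg.norm(w))   # True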