MA100 Summer Notes

LSE MA100 notes for the summer term; they helped me get a 1st.

Document information

Uploaded on: April 28, 2024
Number of pages: 33
Written in: 2022/2023
Type: Class notes
Professor(s): Dr Ioannis Kouletsis
Contains: All classes

Content preview

17. Vector Spaces, 1 of 4

1. Vector Space: A real vector space V is a non-empty set equipped with a vector addition
operation and a scalar multiplication operation such that for all α, β ∈ R and all u, v, w
∈ V:
● u + v ∈ V (closure under addition)
● u + v = v + u (the commutative law for addition)
● u + (v + w) = (u + v) + w (the associative law for addition)
● Presence of zero vector: Unique member 0 of V, such that for all v ∈ V , v + 0 =
v.
● Presence of Negative v: For every v ∈ V there is an element w ∈ V , usually
written as −v, such that v + w = 0.
● αv ∈ V (closure under scalar multiplication)
● α(u + v) = αu + αv (distributive law)
● (α + β)v = αv + βv (distributive law)
● α(βv) = (αβ)v (associative law for scalar multiplication)
● 1v = v.
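The axioms above can be spot-checked numerically for Rn. A minimal Python sketch (the helper names add and scale, and the sample vectors, are illustrative, not from the notes; values are chosen to be float-exact):

```python
# Numerical spot-check (not a proof) of the vector-space axioms in R^3,
# using plain Python lists as vectors.

def add(u, v):
    return [a + b for a, b in zip(u, v)]

def scale(alpha, v):
    return [alpha * a for a in v]

u, v, w = [1.0, 2.0, 3.0], [-4.0, 0.5, 2.0], [0.0, 7.0, -1.0]
alpha, beta = 2.0, -3.0
zero = [0.0, 0.0, 0.0]

assert add(u, v) == add(v, u)                            # commutative law
assert add(u, add(v, w)) == add(add(u, v), w)            # associative law
assert add(v, zero) == v                                 # zero vector
assert add(v, scale(-1.0, v)) == zero                    # negative of v
assert scale(alpha, add(u, v)) == add(scale(alpha, u), scale(alpha, v))
assert scale(alpha + beta, v) == add(scale(alpha, v), scale(beta, v))
assert scale(alpha, scale(beta, v)) == scale(alpha * beta, v)
assert scale(1.0, v) == v
```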
2. NOTE: Sets of functions, sets of matrices, sets of polynomials, sets of sequences and
many other sets of mathematical objects can mimic the properties of column vectors in Rn
provided that suitable definitions of vector addition and scalar multiplication are
introduced on these sets.
3. Subspace: Let V be a vector space. Then a non-empty subset W of V is a subspace of V
if and only if both the following conditions hold:
● for all u, v ∈ W, u + v ∈ W (that is, W is closed under vector addition)
● for all v ∈ W and all α ∈ R, αv ∈ W (that is, W is closed under scalar
multiplication)
4. NOTE: Any flat in Rn (that is, point, line, plane, etc) that contains the origin of Rn is a
subspace of the vector space Rn under the standard operations of vector addition and
scalar multiplication. On the other hand, any flat in Rn that does not contain the origin of
Rn is not a subspace of Rn under the same operations.
5. NOTE: A subset S of a vector space V which contains the zero vector of V may or may
not be a subspace of V .

18. Vector Spaces, 2 of 4

1. Linear Span: Suppose that V is a vector space and that the vectors v1, v2, . . . , vk all
belong to V. The linear span of the set X = {v1, v2, . . . , vk}, denoted by Lin(X) or
Lin{v1, v2, . . . , vk}, is the set of all linear combinations of the vectors v1, v2, . . . , vk. That
is, Lin{v1, v2, . . . , vk} = {α1v1 + α2v2 + ··· + αkvk | α1, α2, . . . , αk ∈ R}.

2. NOTE: If X = {v1, v2, . . . , vk} is a set of vectors that all belong to a vector space V ,
then Lin(X) is a subspace of V . In fact, it is the smallest subspace of V containing the
vectors v1,v2,...,vk.
3. Given a set of vectors X = {v1, v2, . . . , vk} that all belong to a vector space V, we say that X
spans V if Lin(X) = V; that is, if every vector v ∈ V can be written as a linear
combination v = α1v1 + α2v2 + ··· + αkvk for some scalars α1, . . . , αk. To show this, the
matrix A whose columns are the vectors v1, v2, . . . , vk needs to have full row rank.
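The row-rank test for spanning can be sketched with NumPy (assumed available; the vectors below are illustrative):

```python
# Spanning test: the columns of A span R^n exactly when A has full row
# rank, i.e. rank(A) = n. Here A holds v1, v2, v3 in R^2 as columns.
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])
n, k = A.shape                     # n = 2 (dimension), k = 3 (vectors)

rank = np.linalg.matrix_rank(A)
assert rank == n                   # full row rank: the vectors span R^2
```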
4. Linear Independence: Let V be a vector space and v1, v2, . . . , vk ∈ V. The vectors
v1, v2, . . . , vk are called linearly independent if and only if the vector equation α1v1 + α2v2
+ ··· + αkvk = 0 has the unique solution α1 = α2 = ··· = αk = 0. Otherwise, the set is linearly
dependent. To show linear independence, the matrix A = (v1, v2, . . . , vk) needs to have full
column rank.
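Similarly, the column-rank test for independence, again sketched with NumPy and illustrative vectors:

```python
# Independence test: the columns of A are linearly independent exactly
# when A has full column rank, i.e. rank(A) = k. Here v3 = v1 + v2, so
# the set {v1, v2, v3} in R^3 is dependent and the rank falls short.
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])
k = A.shape[1]                     # k = 3 vectors

rank = np.linalg.matrix_rank(A)
assert rank == 2 and rank < k      # not full column rank: dependent set
```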
5. IMPORTANT: Methodology regarding Linear independence and linear span
● Case (i): If the number of vectors, k, exceeds the dimension n of the vector space
Rn, we have that ρ(A) ≤ n < k. In this case, A does not have full column rank, i.e.
the vectors cannot be linearly independent. A may or may not have full row rank,
i.e. the vectors may or may not span Rn, depending on whether ρ(A) < n or ρ(A) = n.
● Case (ii): If the number of vectors, k, is less than the dimension n of the vector
space Rn, we have that ρ(A) ≤ k < n. A may or may not have full column rank,
but it certainly does not have full row rank. In other words, the vectors cannot
span Rn. They may or may not be linearly independent, depending on whether
ρ(A) < k or ρ(A) = k.
● Case (iii): If the number of vectors, k, is equal to the dimension n of the vector
space Rn, the rank of A either satisfies ρ(A) = k = n and the vectors are linearly
independent and also span Rn, or ρ(A) < k = n, in which case the vectors are not
linearly independent and do not span Rn.
● Alternatively, we can consider the determinant of the square matrix A. If |A| ≠ 0,
the vectors are linearly independent and span Rn. If |A| = 0, the vectors are not
linearly independent and they do not span Rn.
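The square case (iii) can be illustrated with a determinant check in NumPy (sample matrix only):

```python
# Case (iii), k = n: a single determinant decides both questions. The
# columns of A are three illustrative vectors in R^3; |A| = 3 != 0, so
# they are linearly independent and span R^3.
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

det = np.linalg.det(A)
assert abs(det - 3.0) < 1e-9       # nonzero determinant
```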

19. Vector Spaces, 3 of 4

1. Theorem 1: The set {v1, v2, . . . , vk} ⊆ Rn is a linearly dependent set if and only if at
least one vector vi in this set is a linear combination of the remaining vectors.
2. NOTE: in the special case where we have two vectors v1 and v2 in Rn, where n ≥ 2, the
above theorem implies that these vectors are linearly dependent if and only if at least one
of them is a scalar multiple of the other. In other words, at least one of the following two
expressions is valid: α1v1 + v2 = 0 for some scalar α1, or v1 + α2v2 = 0 for some scalar
α2.

3. Theorem 2: A set of vectors {v1, v2, . . . , vk} ⊆ Rn which contains the zero vector is a
linearly dependent set.
4. Theorem 3: If {v1, v2, . . . , vk} ⊆ Rn is a linearly independent set and if α1v1 + α2v2
+ ··· + αkvk = β1v1 + β2v2 + ··· + βkvk for some scalars α1, . . . , αk and β1, . . . , βk, then
α1 = β1, α2 = β2, . . . , αk = βk.
5. NOTE: If we have a linearly dependent set of vectors S = {v1,v2,...,vk} ⊆ Rn and we
add another vector w to S, then the new set T = {v1, v2, . . . , vk, w} is still linearly
dependent.
6. Theorem 4: If S = {v1, v2, . . . , vk} is a linearly independent set of vectors in Rn and if
w ∈ Rn is not in the linear span of S; that is, w ∉ Lin(S), then the new set T = {v1, v2,
. . . , vk, w} is linearly independent.
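Theorem 4 can be checked numerically: appending w as an extra column raises the rank exactly when w ∉ Lin(S). A NumPy sketch with illustrative vectors:

```python
# S = {v1, v2} is independent in R^3, and w lies outside Lin(S) (the
# x-y plane), so T = {v1, v2, w} is again independent.
import numpy as np

v1, v2 = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]
w = [0.0, 0.0, 1.0]

S = np.column_stack([v1, v2])
T = np.column_stack([v1, v2, w])

assert np.linalg.matrix_rank(S) == 2   # S is independent
assert np.linalg.matrix_rank(T) == 3   # rank rose: w not in Lin(S)
```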
7. Basis: A subset B = {v1, v2, . . . , vk} of a vector space V is a basis of V if and only if every
v ∈ V can be expressed as a unique linear combination of the vectors in B. Equivalently,
a basis of V is a linearly independent set that also spans V; it is a smallest possible
spanning set of V.
8. Dimension: The number k of vectors in a finite basis of a vector space V is called the
dimension of V. For example, the dimension of Rn is n.
9. NOTE: If a vector space V has a basis consisting of a finite number of vectors, k, then all
the bases of V contain precisely the same number of vectors, k.
10. NOTE: The vector space {0} is defined to have dimension 0. According to this
convention, {0} has a basis that contains no vectors. The only set without elements is the
empty set ∅, so this is the basis for {0}. In other words, {0} = Lin(∅).
11. NOTE: Any flat in Rn through the origin that has a basis of k vectors is called a
k-dimensional subspace of Rn. In particular, a line through the origin is a
one-dimensional subspace of Rn, a plane through the origin is a two-dimensional
subspace of Rn, and so on, until we reach a hyperplane through the origin, which is an (n
− 1)-dimensional subspace of Rn. On the other hand, the vector space F(R) of all
functions from R to R under the standard operations of pointwise addition and scalar
multiplication has no finite basis.
12. If B = {v1, v2, . . . , vk} is a basis of a k-dimensional vector space V and v is any vector in V,
the scalars α1, α2, . . . , αk appearing in the unique linear combination v = α1v1 +
α2v2 + ··· + αkvk are called the coordinates of v with respect to the basis B and are denoted
by (v)B.
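Finding coordinates amounts to solving a linear system: if a matrix P has the basis vectors as its columns, then (v)B solves P(v)B = v. A NumPy sketch with an illustrative basis of R^2:

```python
# Basis B = {(1, 0), (1, 1)} of R^2, stored as the columns of P.
# Solving P x = v gives the coordinate vector (v)_B.
import numpy as np

P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])

coords = np.linalg.solve(P, v)
# v = 1*(1, 0) + 2*(1, 1), so (v)_B = (1, 2).
assert np.allclose(coords, [1.0, 2.0])
```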



20. Vector Spaces, 4 of 4

1. Inner product: On Rn, the standard inner product of x and y is ⟨x, y⟩ = x·y = x1y1 +
x2y2 + ··· + xnyn.
2. Let V be a real vector space. An inner product on V is a function ⟨ , ⟩ from the set of pairs
of vectors in V to the set R which satisfies the following properties:
● ⟨αx+βy,z⟩ = α⟨x,z⟩+β⟨y,z⟩ for all x,y,z ∈ V and all α,β ∈ R.
● ⟨x,y⟩ = ⟨y,x⟩ for all x,y ∈ V.
● ⟨x,x⟩ ≥ 0 for all x ∈ V, and ⟨x,x⟩ = 0 if and only if x=0, the zero vector of the
vector space.
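For the standard inner product on Rn, these three properties can be spot-checked in plain Python (sample vectors, chosen to be float-exact):

```python
# The standard inner product on R^n, with numerical checks of the three
# defining properties for a few sample vectors.

def inner(x, y):
    return sum(a * b for a, b in zip(x, y))

x, y, z = [1.0, 2.0], [3.0, -1.0], [0.5, 4.0]
alpha, beta = 2.0, -3.0

ax_by = [alpha * a + beta * b for a, b in zip(x, y)]   # alpha*x + beta*y
assert inner(ax_by, z) == alpha * inner(x, z) + beta * inner(y, z)  # linearity
assert inner(x, y) == inner(y, x)                                   # symmetry
assert inner(x, x) >= 0 and inner([0.0, 0.0], [0.0, 0.0]) == 0.0    # positivity
```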
3. Norm: In an inner product space V, the norm of x ∈ V is ||x|| = √⟨x, x⟩. On Rn with the
standard inner product, this is ||x|| = √(x1² + x2² + ··· + xn²), the length of x.
4. Theorem 2 (Cauchy–Schwarz inequality): Let V be an inner product space. Then, for all
x, y ∈ V, |⟨x, y⟩| ≤ ||x|| ||y||, where the symbol on the left-hand side is the absolute value of ⟨x, y⟩.
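A quick numerical check of the inequality for the standard inner product (sample vectors only):

```python
# Cauchy-Schwarz check: |<x, y>| = 1 while ||x|| ||y|| = sqrt(6) * 5,
# so the bound holds comfortably for these sample vectors.
import math

def inner(x, y):
    return sum(a * b for a, b in zip(x, y))

def norm(x):
    return math.sqrt(inner(x, x))

x, y = [1.0, 2.0, -1.0], [3.0, 0.0, 4.0]
assert abs(inner(x, y)) <= norm(x) * norm(y)
```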



5. Angle: The angle θ ∈ [0, π] between non-zero vectors x, y ∈ V is defined by cos θ =
⟨x, y⟩ / (||x|| ||y||); by Theorem 2 this ratio lies in [−1, 1], so θ is well defined.
6. Orthogonality: x, y ∈ V are said to be orthogonal if and only if ⟨x, y⟩ = 0. We write x ⊥
y to mean that x, y are orthogonal.
7. Pythagoras Theorem: In an inner product space V, if x, y ∈ V are orthogonal, then
||x + y||² = ||x||² + ||y||².
8. Triangle Inequality: In an inner product space V , if x, y ∈ V , then ||x + y|| ≤ ||x|| + ||y||.
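Both identities can be spot-checked for the standard inner product on R^2 (sample vectors only):

```python
import math

def inner(x, y):
    return sum(a * b for a, b in zip(x, y))

def norm(x):
    return math.sqrt(inner(x, x))

# Pythagoras: x = (3, 0) and y = (0, 4) are orthogonal, and
# ||x + y||^2 = 25 = 9 + 16 = ||x||^2 + ||y||^2.
x, y = [3.0, 0.0], [0.0, 4.0]
s = [a + b for a, b in zip(x, y)]
assert inner(x, y) == 0.0
assert norm(s) ** 2 == norm(x) ** 2 + norm(y) ** 2

# Triangle inequality for a non-orthogonal pair.
u, v = [1.0, 2.0], [3.0, -1.0]
t = [a + b for a, b in zip(u, v)]
assert norm(t) <= norm(u) + norm(v)
```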