Project 1
1. INTRODUCTION
Linear algebra provides a foundation for understanding vectors in many contexts,
extending beyond traditional numeric representations. In this project, we explore
vectors and vector spaces in a broader setting, emphasizing the diverse structures to
which vector space concepts apply. We will apply Polya's problem-solving method and use
advanced calculators when necessary.
2. VECTOR SPACE
Polynomials constitute a vector space over the real numbers, satisfying the fundamental
axioms of a vector space:
a. Additive Identity: The zero polynomial, denoted 0(x) = 0 for all values of
x, acts as the additive identity. For any polynomial p(x), adding the zero
polynomial results in the original polynomial:
0(x) + p(x) = p(x)
b. Closed Under Addition: Adding two polynomials results in another
polynomial. Let p(x) and q(x) be any two polynomials. Their sum, p(x) + q(x),
is also a polynomial.
c. Closed Under Scalar Multiplication: Multiplying a polynomial p(x) by a
real scalar a results in another polynomial, a · p(x).
d. Unitary Element: The scalar constant 1 acts as the unitary element.
Multiplying a polynomial p(x) by 1 results in the original polynomial:
1 · p(x) = p(x)
e. Additive Inverse: For every polynomial p(x), there exists an additive
inverse, denoted by −p(x), such that adding them results in the zero
polynomial:
p(x) + (−p(x)) = 0(x)
f. Commutativity under Addition: Adding polynomials is commutative,
meaning the order of addition doesn't change the result:
p(x) + q(x) = q(x) + p(x)
g. Associativity under Addition and Scalar Multiplication: Addition and
scalar multiplication of polynomials are associative, since for any polynomials
p(x), q(x), and r(x) and any scalars a and b:
(p(x) + q(x)) + r(x) = p(x) + (q(x) + r(x))
and
a · (b · p(x)) = (a · b) · p(x)
h. Distributivity: For any polynomials p(x) and q(x) and any scalars a and b:
a · (p(x) + q(x)) = a · p(x) + a · q(x)
and
(a + b) · p(x) = a · p(x) + b · p(x)
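The axioms above can be checked numerically. The following is a minimal sketch, assuming polynomials are stored as coefficient lists where index i holds the coefficient of x^i (a representation chosen here for illustration, not prescribed by the project):

```python
def poly_add(p, q):
    """Coefficient-wise addition; pads the shorter polynomial with zeros."""
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))
    q = q + [0] * (n - len(q))
    return [a + b for a, b in zip(p, q)]

def poly_scale(a, p):
    """Scalar multiplication: multiply every coefficient by a."""
    return [a * c for c in p]

p = [1, 2, 3]       # 1 + 2x + 3x^2
q = [4, 0, -3]      # 4 - 3x^2
zero = [0, 0, 0]    # the zero polynomial

assert poly_add(zero, p) == p                   # additive identity
assert poly_add(p, q) == poly_add(q, p)         # commutativity
assert poly_scale(1, p) == p                    # unitary element
assert poly_add(p, poly_scale(-1, p)) == zero   # additive inverse
```

Closure is visible in the return types: both operations always produce another coefficient list, i.e., another polynomial.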
Span of Polynomials:
The span of a set of polynomials S, denoted span(S), is the smallest subspace of the
vector space of polynomials that contains all the polynomials in S. It is the collection
of all polynomials that can be formed by adding and subtracting scalar multiples of the
original polynomials. Hence it represents the set of all possible linear combinations of
those polynomials.
If P = {p1(x), p2(x), …, pn(x)} is a set of polynomials, then the span of P is the set of all
polynomials that can be obtained by scaling and adding the polynomials in P with
coefficients from the real numbers.
In simpler terms, it is the set of all polynomials that can be obtained by combining the
given polynomials using addition, subtraction, and scalar multiplication. The span of a set
of polynomials is always a subspace of the vector space of all polynomials. This means
that the span inherits the vector space properties, such as commutativity and associativity
of addition, distributivity of scalar multiplication over addition, and the existence of an
additive identity and additive inverses.
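Membership in a span reduces to solving a linear system on coefficient vectors. As an illustrative sketch (assuming NumPy and the coefficient-vector representation; the specific polynomials are chosen only as an example), we check whether r(x) = 2 + 3x lies in span{1 + x, 1 − x}:

```python
import numpy as np

p1 = np.array([1.0, 1.0])    # 1 + x
p2 = np.array([1.0, -1.0])   # 1 - x
r  = np.array([2.0, 3.0])    # 2 + 3x

# Columns of A are the spanning polynomials; solve A c = r for the weights c.
A = np.column_stack([p1, p2])
coeffs, residuals, rank, _ = np.linalg.lstsq(A, r, rcond=None)

in_span = np.allclose(A @ coeffs, r)
assert in_span
# coeffs comes out as [2.5, -0.5], i.e. r = 2.5*(1 + x) - 0.5*(1 - x)
```

If the least-squares solution does not reproduce r exactly, r is outside the span.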
Linear Combinations:
A linear combination of a set of polynomials is a polynomial formed by adding and
subtracting scalar multiples of the original polynomials. Since we work over the real
numbers here, the coefficients of the linear combination can be any real numbers.
A linear combination of polynomials p1(x), p2(x), …, pn(x) is given by
a1 · p1(x) + a2 · p2(x) + … + an · pn(x), where a1, a2, …, an are
real numbers.
In simpler terms, a linear combination of polynomials is an expression constructed by
multiplying each polynomial by a constant and adding the results.
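This construction can be sketched directly in code (assuming, as an illustration, the coefficient-list representation where index i holds the coefficient of x^i):

```python
def lin_comb(coeffs, polys):
    """Return sum_i coeffs[i] * polys[i], padding all polynomials to a common length."""
    n = max(len(p) for p in polys)
    out = [0] * n
    for a, p in zip(coeffs, polys):
        for i, c in enumerate(p):
            out[i] += a * c
    return out

# 2*(1 + x) + 3*(x^2) = 2 + 2x + 3x^2
result = lin_comb([2, 3], [[1, 1], [0, 0, 1]])
assert result == [2, 2, 3]
```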
Linear Independence:
A set of polynomials is said to be linearly independent if no polynomial in the set can be
expressed as a linear combination of the other polynomials in the set. In other words, no
polynomial in the set is redundant, and every polynomial in the set contributes uniquely
to the span of the set.
Hence, for a set of polynomials {p1(x), p2(x), …, pn(x)}, if the equation
a1 · p1(x) + a2 · p2(x) + … + an · pn(x) = 0
holds only when a1 = a2 = … = an = 0, then the set is linearly independent.
To determine whether a set of polynomials is linearly independent, we can associate each
polynomial with its full coefficient vector, that is, its coordinates with respect to the
monomial basis 1, x, x², …. These vectors form the columns of a matrix. If the matrix is
square and its determinant is nonzero, then the set of polynomials is linearly independent;
more generally, the set is independent exactly when the matrix has full column rank.
Otherwise, the set is linearly dependent.
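The determinant and rank tests can be sketched as follows (assuming NumPy, with coefficient vectors as matrix columns; the example sets are chosen for illustration):

```python
import numpy as np

# {1, x, x^2} in P_2: coefficient vectors form the identity matrix.
M = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
assert abs(np.linalg.det(M)) > 1e-12        # nonzero determinant -> independent

# {1 + x, 2 + 2x}: the second polynomial is twice the first.
N = np.array([[1.0, 2.0],
              [1.0, 2.0]])
assert np.linalg.matrix_rank(N) < 2         # rank deficient -> dependent
```

The rank test also covers the non-square case, where a determinant is not defined.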
3. INNER PRODUCT AND NORM
The first proposed inner product, defined by multiplying the coefficients of the same
degree and summing the products, satisfies the requirements for an inner product.
Verifying the requirements:
Requirement a: ⟨0, x⟩ = 0 · 0 + 0 · 1 = 0
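This coefficient-wise inner product, and the norm it induces, can be sketched in code (assuming the coefficient-list representation used above; the definition ⟨p, q⟩ = sum of products of same-degree coefficients is taken from the text):

```python
import math

def inner(p, q):
    """<p, q> = sum of products of same-degree coefficients."""
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))
    q = q + [0] * (n - len(q))
    return sum(a * b for a, b in zip(p, q))

def norm(p):
    """Norm induced by the inner product: ||p|| = sqrt(<p, p>)."""
    return math.sqrt(inner(p, p))

zero = [0]
x = [0, 1]                       # the polynomial x
assert inner(zero, x) == 0       # requirement a: <0, x> = 0
assert inner(x, x) == 1
assert norm(x) == 1.0
```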