EC124 Notes
Probability: Venn Diagrams: An experiment is a procedure whose outcome is random and which can be repeated. The sample space Ω is the set of all possible outcomes of the experiment, and can be drawn as a rectangle. The Ci are the individual possible outcomes of the experiment (elementary events). They are mutually exclusive – if one event occurs, the others cannot occur at the same time – and they are exhaustive, meaning that whenever the experiment is undertaken one of the elementary events must occur.
You can show this by splitting up the sample space. An event A is formed by combining elementary events, e.g. A = C1 ∪ C2 ∪ C3.
This forms the basis of Venn diagrams.
B is a subset of A means that B lies entirely within A.
A union B (A ∪ B) is made up of all of A and all of B, i.e. A or B.
A intersection B (A ∩ B) is made up of everything that is in both A and B.
Complement = not, i.e. the complement of A (written Aᶜ) is everything inside the sample space that is not in A.
Mutually exclusive means that the intersection of the two events is the empty set ϕ.
Exhaustive means that the union of all the events is equal to Ω.
Union-intersection rule: any event A can be constructed as the union of the intersections of A with all the elementary events in the sample space: A = (A ∩ C1) ∪ (A ∩ C2) ∪ … ∪ (A ∩ Ck).
A ∪ B = B ∪ A
A ∩ B = B ∩ A
A ∩ ϕ = ϕ
A ∪ ϕ = A
A ∪ (B ∩ C) = (A ∪ B) ∩ (A ∪ C)
A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C)
(A ∩ B)ᶜ = Aᶜ ∪ Bᶜ
(A ∪ B)ᶜ = Aᶜ ∩ Bᶜ
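These identities can be sanity-checked with Python sets; a minimal sketch (the sample space omega and the events A, B, C below are invented for illustration):
```python
# Minimal sketch: checking the identities above with Python sets.
# The sample space `omega` and the events A, B, C are invented examples.
omega = set(range(1, 11))
A, B, C = {1, 2, 3, 4}, {3, 4, 5, 6}, {5, 6, 7, 8}

def comp(E):
    """Complement of E relative to the sample space."""
    return omega - E

assert A | B == B | A                          # A ∪ B = B ∪ A
assert A & B == B & A                          # A ∩ B = B ∩ A
assert A & set() == set() and A | set() == A   # identities with ϕ
assert A | (B & C) == (A | B) & (A | C)        # distributive laws
assert A & (B | C) == (A & B) | (A & C)
assert comp(A & B) == comp(A) | comp(B)        # De Morgan
assert comp(A | B) == comp(A) & comp(B)        # De Morgan
print("all identities hold")
```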
Simple Probability: P(A) = probability of A occurring, with 0 ≤ P(A) ≤ 1.
C_i ∩ C_j = ϕ for i ≠ j → P(C_1 ∪ C_2 ∪ … ∪ C_k) = P(C_1) + P(C_2) + … + P(C_k)
P(C_1 ∪ C_2 ∪ … ∪ C_k) = P(Ω) = 1
P(Aᶜ) = 1 − P(A)
A ∩ Aᶜ = ϕ → P(A ∩ Aᶜ) = 0
P(A ∪ Aᶜ) = P(A) + P(Aᶜ) = P(Ω) = 1
P(ϕ) = 0
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(A ∩ C) + P(A ∩ B ∩ C)
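When outcomes are equally likely, P(E) = |E| / |Ω|, so the addition rule can be checked numerically; a small sketch with a fair die (the events A and B are chosen arbitrarily):
```python
from fractions import Fraction

# Fair six-sided die: equally likely outcomes, so P(E) = |E| / |Ω|.
omega = set(range(1, 7))
def P(E):
    return Fraction(len(E), len(omega))

A = {2, 4, 6}   # "even"
B = {4, 5, 6}   # "at least 4"

# Addition rule: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B)
print(P(A | B))   # 2/3
```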
Bivariate Probability: The probability of C_1^B irrespective of the outcome of experiment A (the marginal probability) is found by adding up all the joint probabilities in the first column of the joint probability table.
The sum of all the joint probabilities has to equal 1.
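A sketch of a joint probability table as a nested dict, with the marginal obtained by summing a column (all the numbers are invented for illustration):
```python
# Hypothetical joint probabilities P(C_i^A ∩ C_j^B): rows are outcomes of
# experiment A, columns are outcomes of experiment B. Numbers are invented.
joint = {
    "C1_A": {"C1_B": 0.10, "C2_B": 0.30},
    "C2_A": {"C1_B": 0.25, "C2_B": 0.35},
}

# Marginal of C1_B: sum the first column over all outcomes of A.
p_C1B = sum(row["C1_B"] for row in joint.values())
print(p_C1B)   # 0.35 (up to float rounding)

# All the joint probabilities must sum to 1.
total = sum(p for row in joint.values() for p in row.values())
assert abs(total - 1.0) < 1e-12
```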
Conditional probabilities: P(C_2^B | C_1^A) = P(C_2^B ∩ C_1^A) / P(C_1^A)
In general, P(C_j^B | C_i^A) = P(C_j^B, C_i^A) / P(C_i^A), where P(C_j^B, C_i^A) is alternative (comma) notation for the joint probability P(C_j^B ∩ C_i^A).
Multiplication rule: P(C_2^B ∩ C_1^A) = P(C_2^B | C_1^A) · P(C_1^A)
Reversing the conditioning: P(C_1^A | C_2^B) = P(C_2^B | C_1^A) · P(C_1^A) / P(C_2^B)
Total probability: P(C_i^A) = P(C_i^A ∩ C_1^B) + P(C_i^A ∩ C_2^B) + … + P(C_i^A ∩ C_k^B)
Equivalently: P(C_i^A) = P(C_i^A | C_1^B) · P(C_1^B) + P(C_i^A | C_2^B) · P(C_2^B) + … + P(C_i^A | C_k^B) · P(C_k^B) = Σ_{j=1}^{k} P(C_i^A | C_j^B) · P(C_j^B)
Bayes' theorem therefore:
P(C_j^B | C_i^A) = P(C_i^A | C_j^B) · P(C_j^B) / Σ_{j=1}^{k} P(C_i^A | C_j^B) · P(C_j^B) = P(C_i^A ∩ C_j^B) / P(C_i^A)
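A numeric sketch of Bayes' theorem: an invented prior over a three-event B-partition and invented likelihoods P(C_i^A | C_j^B); the posterior is likelihood times prior divided by the total-probability denominator:
```python
# Bayes' theorem sketch: the B-partition, prior and likelihoods below are
# all invented for illustration.
prior = {"B1": 0.2, "B2": 0.5, "B3": 0.3}        # P(C_j^B)
likelihood = {"B1": 0.9, "B2": 0.4, "B3": 0.1}   # P(C_i^A | C_j^B)

# Denominator: P(C_i^A) = Σ_j P(C_i^A | C_j^B) · P(C_j^B)
p_A = sum(likelihood[j] * prior[j] for j in prior)

# Posterior: P(C_j^B | C_i^A) = P(C_i^A | C_j^B) · P(C_j^B) / P(C_i^A)
posterior = {j: likelihood[j] * prior[j] / p_A for j in prior}

print(p_A)                                          # ≈ 0.41
assert abs(sum(posterior.values()) - 1.0) < 1e-12   # posteriors sum to 1
```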
Statistical independence: Events C_i^A and C_j^B are independent if:
P(C_i^A | C_j^B) = P(C_i^A ∩ C_j^B) / P(C_j^B) = P(C_i^A)
equivalently, P(C_i^A ∩ C_j^B) = P(C_i^A) · P(C_j^B)
Note that statistical independence is not the same as mutual exclusivity.
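The difference shows up clearly in a two-coin example: the sketch below has a pair of independent events with non-empty intersection, and a mutually exclusive pair that is not independent (events chosen for illustration):
```python
from fractions import Fraction

# Two fair coins: four equally likely outcomes.
omega = {("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")}
def P(E):
    return Fraction(len(E), len(omega))

A = {w for w in omega if w[0] == "H"}   # first coin heads
B = {w for w in omega if w[1] == "H"}   # second coin heads
D = {w for w in omega if w[0] == "T"}   # first coin tails

# A and B are independent: P(A ∩ B) = P(A) · P(B)
assert P(A & B) == P(A) * P(B)

# A and D are mutually exclusive but NOT independent:
assert A & D == set()
assert P(A & D) != P(A) * P(D)   # 0 ≠ 1/4
```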
Conditional Probability with 3 variables:
P(A | B ∩ C) = P(A ∩ B ∩ C) / P(B ∩ C)
P(A | B ∪ C) = [P(A ∩ B) + P(A ∩ C) − P(A ∩ B ∩ C)] / [P(B) + P(C) − P(B ∩ C)]
Chain rule: P(A ∩ B ∩ C) = P(A | B ∩ C) · P(B | C) · P(C)
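A quick check of the chain rule on a fair die, with events chosen arbitrarily:
```python
from fractions import Fraction

# Fair die; events A, B, C are arbitrary choices for the check.
omega = set(range(1, 7))
def P(E):
    return Fraction(len(E), len(omega))

def cond(E, F):
    """P(E | F) = P(E ∩ F) / P(F)."""
    return P(E & F) / P(F)

A, B, C = {1, 2, 3}, {2, 3, 4}, {3, 4, 5}

# Chain rule: P(A ∩ B ∩ C) = P(A | B ∩ C) · P(B | C) · P(C)
assert P(A & B & C) == cond(A, B & C) * cond(B, C) * P(C)
print(P(A & B & C))   # 1/6
```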
Permutations and Combinations:
5P2 = 5! / (5 − 2)! = 5! / 3! = 20; in general nPr = n! / (n − r)!
5C2 = 5P2 / 2! = 5! / ((5 − 2)! 2!) = 10; in general nCr = n! / (r! (n − r)!)
With permutations the order matters; with combinations it does not, which is why combinations give the smaller count.
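Python's math module provides these directly (math.perm and math.comb, Python 3.8+); a quick check of the 5P2 and 5C2 calculations above:
```python
import math
from math import factorial

# 5P2: ordered selections of 2 from 5.
assert math.perm(5, 2) == factorial(5) // factorial(5 - 2) == 20

# 5C2: unordered selections — divide out the 2! orderings of each pair.
assert math.comb(5, 2) == math.perm(5, 2) // factorial(2) == 10
```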
If events are independent, P(A | B) = P(A): whether B has occurred does not affect the probability of A.
If events are mutually exclusive, P(A | B) = 0: if B occurs, A cannot occur.
Univariate and Bivariate Distributions: For a random experiment with sample space Ω, a function X which assigns to each element C of Ω a real number X(C) = x is called a random variable.
Example: Toss two coins, Ω = {HH, HT, TH, TT}. Let X(C) = number of tails.
x:        0    1    2
P(X = x): 1/4  1/2  1/4
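This pmf can be derived by enumerating the sample space and counting how many outcomes map to each value of X; a short sketch:
```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Sample space of two coin tosses; each of the 4 outcomes has probability 1/4.
omega = list(product("HT", repeat=2))

# Random variable: X(C) = number of tails in outcome C.
def X(outcome):
    return outcome.count("T")

counts = Counter(X(w) for w in omega)
pmf = {x: Fraction(n, len(omega)) for x, n in sorted(counts.items())}
print(pmf)   # {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
```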
Discrete Univariate Distributions: If p(x) ≥ 0 and Σ_x p(x) = 1, then X is a discrete random variable with probability (mass) function p(x).
P(a ≤ X ≤ b) = Σ_{x=a}^{b} p(x)
The cumulative distribution function of X is such that for each x_0:
F(x_0) = P(X ≤ x_0) = Σ_{x ≤ x_0} p(x)
lim_{x→−∞} F(x) = 0
lim_{x→∞} F(x) = 1
P(a < X ≤ b) = F(b) − F(a)
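A sketch building the CDF from the two-coin pmf above and checking P(a < X ≤ b) = F(b) − F(a):
```python
from fractions import Fraction

# pmf of X = number of tails in two coin tosses (from the example above).
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def F(x0):
    """CDF: F(x0) = P(X ≤ x0) = Σ_{x ≤ x0} p(x)."""
    return sum(p for x, p in pmf.items() if x <= x0)

assert F(-1) == 0 and F(2) == 1        # limits at the ends of the support
assert F(1) - F(0) == Fraction(1, 2)   # P(0 < X ≤ 1) = F(1) − F(0) = p(1)
```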
Measures of central tendency and dispersion (spread): Median: if P(X ≤ x) ≥ 1/2 and P(X ≥ x) ≥ 1/2, then the median of the variable X is x. Note there might be a case where the median is not uniquely defined.
Mode: The value of x such that p(x) is maximised.
Mean (expectation): E(X) = Σ_x p(x) x = μ_x, i.e. p_1 x_1 + p_2 x_2 + … + p_n x_n
Variance: V(X) = Σ_{i=1}^{k} p_i (x_i − μ)² = E(X²) − [E(X)]²
where E(X²) = p_1 x_1² + p_2 x_2² + … + p_n x_n²
In general: E[g(X)] = Σ_j p_j g(x_j)
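A sketch computing the variance of the two-coin pmf both ways, via Σ p_i (x_i − μ)² and via E(X²) − [E(X)]², using a small E[g(X)] helper:
```python
from fractions import Fraction

# pmf of X = number of tails in two coin tosses.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def E(g=lambda x: x):
    """E[g(X)] = Σ_j p_j g(x_j)."""
    return sum(p * g(x) for x, p in pmf.items())

mu = E()                                                  # E(X) = 1
var_def = sum(p * (x - mu) ** 2 for x, p in pmf.items())  # Σ p_i (x_i − μ)²
var_alt = E(lambda x: x ** 2) - mu ** 2                   # E(X²) − [E(X)]²
assert var_def == var_alt == Fraction(1, 2)
```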
Rules of Expectations: E(X) = Σ_{i=1}^{k} p_i x_i
E(X + a) = Σ_{i=1}^{k} p_i (x_i + a) = Σ_{i=1}^{k} (p_i x_i + p_i a) = Σ_{i=1}^{k} p_i x_i + a Σ_{i=1}^{k} p_i = E(X) + a
E(aX) = Σ_{i=1}^{k} p_i (a x_i) = a Σ_{i=1}^{k} p_i x_i = a E(X)
Rules of Variances:
V(a + X) = Σ_{i=1}^{k} p_i {(x_i + a) − E(X + a)}² = Σ_{i=1}^{k} p_i [x_i − E(X)]² = Var(X)
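These rules can be verified numerically on the two-coin pmf with an arbitrary constant a (the value 3 below is invented):
```python
from fractions import Fraction

# pmf of X = number of tails in two coin tosses; the constant a is arbitrary.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
a = Fraction(3)

def E(g):
    return sum(p * g(x) for x, p in pmf.items())

def V(g):
    m = E(g)
    return sum(p * (g(x) - m) ** 2 for x, p in pmf.items())

EX = E(lambda x: x)
assert E(lambda x: x + a) == EX + a           # E(X + a) = E(X) + a
assert E(lambda x: a * x) == a * EX           # E(aX) = a · E(X)
assert V(lambda x: x + a) == V(lambda x: x)   # V(a + X) = V(X)
```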