• This is the cheat sheet allowed during ASU's CSE 575 Statistical
Machine Learning exam!
• It summarizes the course's contents, starting with week 4's
Graphical Models and running through week 7's Neural Networks /
Deep Learning.
• The first 5 pages are the full-scale version of the summary that
you can read comfortably.
• The last 2 pages are the actual cheat sheet that you can print:
the summary shrunk to fit 2 pages (or 1 page, both sides).
• It is advisable to highlight the underlined titles on the
printed cheat sheet with a marker for better visibility during the
exam.
• Good luck on your exam!
W4. Graphical Models
Bayesian Networks (Bayes Nets): DAG, nodes = random variables, di-
rected edges represent immediate dependence of nodes
-model parameters are probabilities
-tree-structured BN: belief propagation used for inference problems
-generalizing the above method ⇒ junction tree algorithm
-more often: approximation methods (variational methods, sampling / Monte Carlo)
-learning the probabilities (Expectation-Maximization (EM) algorithm):
relative frequency for estimating probabilities, prior typically assumed,
MLE principle
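The relative-frequency idea above can be sketched in code. This is a hedged toy example, not from the course: a two-node Bayes net (Rain → WetGrass) with made-up data, where the MLE of each conditional probability is just the relative frequency in the observations.

```python
# Sketch: MLE of a Bayes-net conditional probability table by relative
# frequency. The network (Rain -> WetGrass) and the data are hypothetical.
from collections import Counter

data = [  # (rain, wet_grass) joint observations
    (True, True), (True, True), (True, False),
    (False, False), (False, False), (False, True),
]
counts = Counter(data)

def p_wet_given_rain(r):
    """MLE of P(WetGrass = True | Rain = r) = count(r, wet) / count(r)."""
    n_r = sum(v for (rain, _), v in counts.items() if rain == r)
    return counts[(r, True)] / n_r

print(p_wet_given_rain(True))   # 2 of 3 rainy days had wet grass
print(p_wet_given_rain(False))  # 1 of 3 dry days had wet grass
```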
Hidden Markov Models (HMM) = dynamic BN (modeling a process
indexed by time) -first-order Markov chain: P (st = Sj |st−1 = Si )
HMM ≜ {θ, Ω, A, B, π},
θ = set of hidden states, Ω = set of outputs (observations)
aij = P (st = Sj |st−1 = Si ), A = {aij } :transition probability matrix
observation probability P (ot |st ) = the emission probability (observation at
time t given state st )
B = emission probability matrix, bjk = P (ot = vk |st = Sj ) where vk = kth
symbol in Ω
Π = initial state distribution = {πi }, πi = P (s1 = Si ) : system starts from
state Si with probability πi
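The parameters {θ, Ω, A, B, π} above are enough to compute the probability of an observation sequence via the forward algorithm. A minimal sketch with made-up toy values (2 hidden states, 2 output symbols; all numbers are assumptions, not from the course):

```python
# Sketch: forward algorithm for an HMM with the notation above.
# A[i][j] = P(s_t = S_j | s_{t-1} = S_i)  (transition matrix)
# B[j][k] = P(o_t = v_k | s_t = S_j)      (emission matrix)
# pi[i]   = P(s_1 = S_i)                  (initial state distribution)
import numpy as np

A  = np.array([[0.7, 0.3],
               [0.4, 0.6]])
B  = np.array([[0.9, 0.1],
               [0.2, 0.8]])
pi = np.array([0.5, 0.5])

def forward(obs):
    """P(o_1, ..., o_T) = sum over states of alpha_T."""
    alpha = pi * B[:, obs[0]]          # alpha_1(i) = pi_i * b_i(o_1)
    for o in obs[1:]:
        # alpha_t(j) = [sum_i alpha_{t-1}(i) * a_ij] * b_j(o_t)
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

print(forward([0, 1, 0]))  # ≈ 0.099375
```

The recursion marginalizes over all hidden state paths in O(T·N²) instead of enumerating the Nᵀ possible paths.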
... read more on the next page