Deep Learning
Created: October 30, 2021 4:35 PM
Class: S2
Type: Materials
Lecture 1
Introduction
Foundations of Deep Learning
Probabilistic Learning
e.g. toss a coin → the outcome follows a Bernoulli distribution
Likelihood function in this case: a product, because the tosses are independent and identical, all from the same coin with one probability parameter.
Objective function: the likelihood; all samples are identically distributed, so they all follow the same Bernoulli.
Finding the best parameter: maximize the likelihood (maximum likelihood estimation), as worked out below.
Formulating it as a general approach: probabilistic (likelihood-based) learning → it is always done like this.
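A worked sketch of the coin example (the symbols μ for the head probability and y_i ∈ {0,1} for the outcomes are assumed, not fixed by the notes):

```latex
% Bernoulli distribution for a single toss with head probability \mu
p(y \mid \mu) = \mu^{y}(1-\mu)^{1-y}, \qquad y \in \{0,1\}

% i.i.d. tosses => the likelihood is a product of identical Bernoulli terms
L(\mu) = \prod_{i=1}^{N} \mu^{y_i}(1-\mu)^{1-y_i}

% maximize the log-likelihood instead (same maximizer, turns the product into a sum)
\log L(\mu) = \sum_{i=1}^{N} \big( y_i \log\mu + (1-y_i)\log(1-\mu) \big)

% setting the derivative to zero gives the intuitive estimate: the fraction of heads
\hat{\mu} = \frac{1}{N}\sum_{i=1}^{N} y_i
```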
p(y|x) is a probability distribution
For binary values a Bernoulli distribution can be used; real or continuous values can be modeled with a Gaussian or other distributions.
Independently and identically distributed → i.i.d.; if the data is sequential instead, each sample depends on the past, so you need a sequential method (autoregressive generative models, RNNs), as in the factorization sketch below.
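A minimal sketch of the contrast between the two cases (notation assumed, not from the notes):

```latex
% i.i.d. data: the joint distribution factorizes into independent identical terms
p(y_1, \dots, y_N) = \prod_{i=1}^{N} p(y_i)

% sequential data: each term depends on the past (autoregressive factorization)
p(y_1, \dots, y_N) = \prod_{i=1}^{N} p(y_i \mid y_1, \dots, y_{i-1})
```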
Logistic Regression
= a model that allows us to classify objects
Linear dependency: e.g. the result of the coin toss depends on the hand velocity while tossing, or whether a mail is spam depends on its words. The model is probabilistic, so we need a sigmoid to squash the linear score into a probability; a minimal sketch follows.
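A minimal NumPy sketch of this model (the names sigmoid, predict_proba, theta, and x are illustrative, not from the lecture):

```python
import numpy as np

def sigmoid(z):
    # squashes any real-valued score into (0, 1), so it can act as a probability
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(theta, x):
    # logistic regression: p(y = 1 | x) = sigmoid(theta^T x)
    return sigmoid(theta @ x)

# toy example: 3 features (e.g. counts of three "spammy" words in a mail)
theta = np.array([0.5, -1.2, 0.3])   # weights (here set by hand, normally learned)
x = np.array([1.0, 0.0, 2.0])        # feature vector of one mail
print(predict_proba(theta, x))       # probability that this mail is spam
```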
Gradient = vector of partial derivatives
Note for the image: derivative with respect to θ, applying the chain rule.
We get the difference between our prediction (the sigmoid output ŷ) and the label y, multiplied by the input x, which forms the gradient. If our prediction is correct, the difference is zero and the weights θ will not change; otherwise we move along the direction of x, with the sign (±1) given by the error, as in the sketch below.
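A sketch of the step the image presumably shows, assuming the loss is the negative log-likelihood of a single example:

```latex
% model: p(y=1 \mid x) = \sigma(\theta^\top x), loss = negative log-likelihood
\ell(\theta) = -\big( y\log\sigma(\theta^\top x) + (1-y)\log(1-\sigma(\theta^\top x)) \big)

% chain rule with \sigma'(z) = \sigma(z)(1-\sigma(z)) gives the gradient
\nabla_\theta\,\ell(\theta) = \big( \sigma(\theta^\top x) - y \big)\,x

% gradient-descent update: zero update when the prediction matches the label
\theta \leftarrow \theta - \eta\,\big( \sigma(\theta^\top x) - y \big)\,x
```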
The next image is not important to know in much detail: