
CS 189/289A Introduction to Machine Learning
Spring 2021 Jonathan Shewchuk HW2: I ♥ Math
Due Wednesday, February 10 at 11:59 pm

• Homework 2 is an entirely written assignment; no coding involved.
• We prefer that you typeset your answers using LaTeX or other word processing software. If
you haven’t yet learned LaTeX, one of the crown jewels of computer science, now is a good
time! Neatly handwritten and scanned solutions will also be accepted.

• In all of the questions, show your work, not just the final answer.
• Start early. This is a long assignment. Most of the material is prerequisite material not
covered in lecture; you are responsible for finding resources to understand it.

Deliverables:

1. Submit a PDF of your homework to the Gradescope assignment entitled “HW2 Write-Up”.
You may typeset your homework in LaTeX or Word (submit PDF format, not .doc/.docx
format) or submit neatly handwritten and scanned solutions. Please start each question on
a new page. If there are graphs, include those graphs in the correct sections. Do not put
them in an appendix. We need each solution to be self-contained on pages of its own.
• In your write-up, please state whom you had discussions with (not counting course staff)
about the homework contents.
• In your write-up, please copy the following statement and sign your signature next to it.
(Mac Preview and Foxit PDF Reader, among others, have tools to let you sign a PDF
file.) We want to make it extra clear so that no one inadvertently cheats.
“I certify that all solutions are entirely in my own words and that I have not looked at
another student’s solutions. I have given credit to all external sources I consulted.”





1 Identities with Expectation
For this exercise, the following identity might be useful: for a probability event A, $P(A) = E[\mathbf{1}\{A\}]$, where $\mathbf{1}\{\cdot\}$ is the indicator function.

1. Let X be a random variable with density $f(x) = \lambda e^{-\lambda x}\,\mathbf{1}\{x > 0\}$. Show that $E[X^k] = \frac{k!}{\lambda^k}$ for integer $k \geq 0$. Hint: One way is to do induction on k.
2. For any non-negative random variable X and constant $t > 0$, show that $P(X \geq t) \leq \frac{E[X]}{t}$. Hint: show that for $a, b > 0$, $\mathbf{1}\{a \geq b\} \leq \frac{a}{b}$.
3. For any non-negative random variable X, prove the identity
$$E[X] = \int_0^\infty P(X \geq t)\,dt.$$
You may assume that X admits a density to simplify.
4. For any non-negative random variable X with finite variance (i.e., $E[X^2] < \infty$), prove that
$$P(X > 0) \geq \frac{(EX)^2}{E[X^2]}.$$
Hint: Use the Cauchy–Schwarz inequality $\langle u, v \rangle^2 \leq \langle u, u \rangle \langle v, v \rangle$. You have most likely seen it applied when the inner product is the real dot product; however, it holds for arbitrary inner products. Without proof, use the fact that the expectation $E[UV]$ is a valid inner product of random variables U and V.
(Note that by assumption we know $P(X \geq 0) = 1$, so this inequality is indeed quite powerful.)
5. For a random variable X with finite variance and $E[X] = 0$, prove that
$$P(X \geq t) \leq \frac{E[X^2]}{E[X^2] + t^2} \quad \text{for any } t \geq 0.$$
Hint: Try using logic similar to Question 1.4 on $t - X$.

Solution:

1. Moment Generating Function. Calculate the MGF: $M_X(t) = E[e^{tX}] = \int_{x > 0} \lambda e^{(t-\lambda)x}\,dx = \frac{1}{1 - t/\lambda}$ if $t < \lambda$ and undefined otherwise. Since the MGF is defined in a neighborhood of 0 (specifically $|t| < \lambda$), all moments $E[X^k]$ exist. Furthermore, from properties of the MGF, $\frac{E[X^k]}{k!}$ is the coefficient of $t^k$. Expanding $\frac{1}{1 - t/\lambda}$ as $\sum_{k \geq 0} \frac{1}{\lambda^k} t^k$ completes the solution.

Induction. Base case: $EX^0 = 1$. Inductive hypothesis: for $k > 0$, $EX^k = \frac{k}{\lambda} EX^{k-1}$. Inductive step: $EX^k = \int_0^\infty \lambda x^k e^{-\lambda x}\,dx$. Let $u = x^k$ and $dv = \lambda e^{-\lambda x}\,dx$, so $du = k x^{k-1}\,dx$ and $v = -e^{-\lambda x}$. Then
$$\int_0^\infty \lambda x^k e^{-\lambda x}\,dx = \left[-x^k e^{-\lambda x}\right]_0^\infty + \int_0^\infty k x^{k-1} e^{-\lambda x}\,dx = 0 + \frac{k}{\lambda} \int_0^\infty \lambda x^{k-1} e^{-\lambda x}\,dx = \frac{k}{\lambda}\,EX^{k-1},$$
where the last equality comes from the inductive hypothesis. So $EX^k = \prod_{i=1}^{k} \frac{i}{\lambda} = \frac{k!}{\lambda^k}$. Note that the trick of separating out the $k$ ($= \frac{k}{\lambda} \cdot \lambda$) factor in the second-to-last equality represents a generally useful approach for solving problems: figure out what form you want the problem to “look like” and try to transform it as close as possible to that form. Since we know we’re dealing with induction, we know we would like to somehow obtain $EX^{k-1}$ during the inductive step. By our assumption, $EX^{k-1} = \int_0^\infty \lambda x^{k-1} e^{-\lambda x}\,dx$. By keeping this in mind and paying close attention, we realize we can move the constant $\frac{k}{\lambda}$ outside the integral in the second-to-last equality, leaving behind the needed $\lambda$ factor.
[RUBRIC: There could be other ways to solve this. Any completely correct solution gets (+2
points). Any partially correct or incomplete solution gets (+1 point).]
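Although this homework involves no coding, the identity $E[X^k] = \frac{k!}{\lambda^k}$ is easy to sanity-check numerically. Below is a minimal Monte Carlo sketch (an illustration, not part of the written solution) using NumPy; the rate $\lambda = 2$, the sample size, and the seed are arbitrary choices.

```python
# Illustrative Monte Carlo check (not part of the written solution):
# compare empirical k-th moments of an Exponential(lambda) sample
# with the closed form k!/lambda^k derived above.
import math
import numpy as np

rng = np.random.default_rng(0)      # fixed seed, arbitrary
lam = 2.0                           # rate parameter; arbitrary choice
# NumPy parametrizes the exponential by its scale = 1/lambda.
x = rng.exponential(scale=1.0 / lam, size=1_000_000)

for k in range(5):
    empirical = np.mean(x ** k)               # Monte Carlo estimate of E[X^k]
    exact = math.factorial(k) / lam ** k      # k!/lambda^k
    print(f"k={k}: empirical={empirical:.4f}  exact={exact:.4f}")
```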
2. When $X \geq t$, $\mathbf{1}\{X \geq t\} = 1 \leq \frac{X}{t}$. On the other hand, when $X < t$, $\mathbf{1}\{X \geq t\} = 0 \leq \frac{X}{t}$ since X is non-negative. Take expectations on both sides to complete.
[RUBRIC: A completely correct solution gets (+1 point).]
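As an optional sanity check of Markov's inequality $P(X \geq t) \leq \frac{E[X]}{t}$, here is a short NumPy sketch; Exponential(1) samples and the thresholds are arbitrary choices, and any non-negative distribution would do.

```python
# Illustrative check of Markov's inequality P(X >= t) <= E[X]/t.
# Exponential(1) is an arbitrary choice of non-negative distribution.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)  # non-negative samples, E[X] = 1

for t in (0.5, 1.0, 2.0, 4.0):                  # arbitrary thresholds
    lhs = np.mean(x >= t)                       # empirical P(X >= t)
    rhs = np.mean(x) / t                        # Markov bound E[X]/t
    print(f"t={t}: P(X>=t)={lhs:.4f} <= E[X]/t={rhs:.4f}")
```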

3. First see that $X = \int_{t \geq 0} \mathbf{1}\{X \geq t\}\,dt$. Take expectation and use linearity of expectation: $E[X] = E\left[\int_{t \geq 0} \mathbf{1}\{X \geq t\}\,dt\right] = \int_{t \geq 0} E[\mathbf{1}\{X \geq t\}]\,dt = \int_0^\infty P(X \geq t)\,dt$. Note that X need not have a density function for this solution.
Assuming density $f(x)$:
$$E[X] = E\left[\int_0^\infty \mathbf{1}\{X \geq t\}\,dt\right] = \int_0^\infty \int_0^\infty \mathbf{1}\{x \geq t\}\,dt\,f(x)\,dx = \int_0^\infty \int_0^\infty \mathbf{1}\{x \geq t\}\,f(x)\,dx\,dt = \int_0^\infty P(X \geq t)\,dt.$$
Again assuming density $f(x)$:
$$E[X] = \int_0^\infty x f(x)\,dx = \int_0^\infty \int_0^x f(x)\,dt\,dx = \int_0^\infty \int_t^\infty f(x)\,dx\,dt = \int_0^\infty P(X \geq t)\,dt.$$
[RUBRIC: A completely correct solution gets (+1 point).]
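For those who want to see this layer-cake identity numerically, the illustrative sketch below compares a Monte Carlo estimate of $E[X]$ with a Riemann-sum approximation of $\int_0^\infty P(X \geq t)\,dt$; the Exponential(1) distribution, the grid, and the truncation point are arbitrary choices.

```python
# Illustrative check of E[X] = integral_0^inf P(X >= t) dt.
# Exponential(1) samples; the grid and truncation at t = 20 are arbitrary
# (the exponential tail beyond 20 is negligible).
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)        # E[X] = 1

ts = np.linspace(0.0, 20.0, 2001)
survival = np.array([(x >= t).mean() for t in ts])  # empirical P(X >= t)
integral = survival.sum() * (ts[1] - ts[0])         # Riemann-sum approximation

print(f"E[X] ~ {x.mean():.4f}   integral of P(X>=t) ~ {integral:.4f}")
```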
4. Using the non-negativity of X, we have $EX = E[X \mathbf{1}\{X > 0\}]$. [RUBRIC: (+1 point)]
Now use Cauchy–Schwarz applied to $U := X$ and $V := \mathbf{1}\{X > 0\}$ to conclude that
$$(EX)^2 = (E[X \mathbf{1}\{X > 0\}])^2 \leq E[X^2]\,E[\mathbf{1}\{X > 0\}^2] = E[X^2]\,E[\mathbf{1}\{X > 0\}] = E[X^2]\,P(X > 0).$$
[RUBRIC: Correct application of Cauchy–Schwarz Inequality gets (+1 point).]
[RUBRIC: Total (+2 points).]
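An optional numerical check of this second-moment (Paley–Zygmund-type) bound: the sketch below uses $X = B \cdot Y$ with $B \sim \mathrm{Bernoulli}(0.3)$ and $Y \sim \mathrm{Exponential}(1)$, an arbitrary non-negative choice for which $P(X > 0) < 1$, so the bound is informative.

```python
# Illustrative check of P(X > 0) >= (EX)^2 / E[X^2].
# X = B * Y with B ~ Bernoulli(0.3), Y ~ Exponential(1): here
# P(X > 0) = 0.3 while (EX)^2 / E[X^2] = 0.09 / 0.6 = 0.15.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.binomial(1, 0.3, size=n) * rng.exponential(scale=1.0, size=n)

lhs = np.mean(x > 0)                        # empirical P(X > 0)
rhs = np.mean(x) ** 2 / np.mean(x ** 2)     # (EX)^2 / E[X^2]
print(f"P(X>0)={lhs:.4f} >= (EX)^2/E[X^2]={rhs:.4f}")
```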

5. Using the same idea as in the previous part,
$$E[t - X] \leq E[(t - X)\mathbf{1}\{t - X > 0\}] = E[(t - X)\mathbf{1}\{X < t\}].$$
[RUBRIC: using indicators correctly and arriving at $E[t - X] \leq E[(t - X)\mathbf{1}\{X < t\}]$ gets (+1 point).]
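The target bound from the question statement, $P(X \geq t) \leq \frac{E[X^2]}{E[X^2] + t^2}$ for mean-zero X (Cantelli's inequality), can likewise be checked numerically; the sketch below uses standard normal samples, an arbitrary mean-zero choice with $E[X^2] = 1$.

```python
# Illustrative check of P(X >= t) <= E[X^2] / (E[X^2] + t^2) for mean-zero X.
# Standard normal samples are an arbitrary mean-zero choice with E[X^2] = 1.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)

m2 = np.mean(x ** 2)                # estimate of E[X^2]
for t in (0.0, 0.5, 1.0, 2.0):      # arbitrary thresholds
    lhs = np.mean(x >= t)           # empirical P(X >= t)
    rhs = m2 / (m2 + t ** 2)        # the bound
    print(f"t={t}: P(X>=t)={lhs:.4f} <= {rhs:.4f}")
```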


