Simulation and the Monte Carlo Method 2nd Edition
By
Dirk P. Kroese,
Thomas Taimre,
Zdravko I. Botev,
Reuven Y. Rubinstein
CHAPTER 1
PRELIMINARIES
Probability Theory
1.1 Using the properties of the probability measure in Definition 1.1.1:

A probability $P$ is a rule that assigns a number $0 \leq P(A) \leq 1$ to each event $A$, such that $P(\Omega) = 1$, and such that for any sequence $A_1, A_2, \ldots$ of disjoint events
\[
P\Big(\bigcup_{i} A_i\Big) = \sum_{i} P(A_i)\,,
\]

prove the following results.

(a) $P(A^c) = 1 - P(A)$.

(b) $P(A \cup B) = P(A) + P(B) - P(A \cap B)$.
1.2 Prove the product rule (1.4) for the case of three events:

For any sequence of events $A_1, A_2, \ldots, A_n$,
\[
P(A_1 \cdots A_n) = P(A_1)\, P(A_2 \mid A_1)\, P(A_3 \mid A_1 A_2) \cdots P(A_n \mid A_1 \cdots A_{n-1})\,,
\]
using the abbreviation $A_1 A_2 \cdots A_k = A_1 \cap A_2 \cap \cdots \cap A_k$.
1.3 We draw three balls consecutively from a bowl containing exactly five white and five
black balls, without putting them back. What is the probability that all drawn balls will be
black?
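One way to sanity-check the answer is by simulation. The following is a minimal Monte Carlo sketch in Python (the trial count is chosen arbitrarily); the exact value is $(5/10)(4/9)(3/8) = 1/12$:

```python
import random

def all_black(trials=100_000):
    """Estimate the probability that three balls drawn without
    replacement from 5 white + 5 black balls are all black."""
    count = 0
    for _ in range(trials):
        bowl = ["B"] * 5 + ["W"] * 5
        draw = random.sample(bowl, 3)          # draw without replacement
        count += all(b == "B" for b in draw)
    return count / trials

print(all_black())   # close to 1/12, i.e., about 0.0833
```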
1.4 Consider the random experiment where we toss a biased coin until heads comes up.
Suppose the probability of heads on any one toss is p. Let X be the number of tosses
required. Show that X ~ G(p).
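Recall that $\mathsf{G}(p)$ denotes the geometric distribution with pmf $P(X = k) = p(1-p)^{k-1}$, $k = 1, 2, \ldots$. A quick empirical sketch (the value $p = 0.3$ is assumed purely for illustration) compares observed frequencies with this pmf:

```python
import random

p, trials = 0.3, 200_000          # p is assumed for illustration
counts = {}
for _ in range(trials):
    k = 1
    while random.random() >= p:   # keep tossing until heads comes up
        k += 1
    counts[k] = counts.get(k, 0) + 1

# empirical frequency vs. geometric pmf p*(1-p)^(k-1)
for k in range(1, 6):
    print(k, counts.get(k, 0) / trials, p * (1 - p) ** (k - 1))
```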
1.5 In a room with many people, we ask each person his/her birthday, for example, May
5. Let N be the number of people queried until we get a "duplicate" birthday.
(a) Calculate $P(N > n)$, $n = 0, 1, 2, \ldots$.

(b) For which $n$ do we have $P(N \leq n) \geq 1/2$?
(c) Use a computer to calculate E[N].
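For part (c), one can use the identity $E[N] = \sum_{n \geq 0} P(N > n)$. A minimal sketch, assuming 365 equally likely birthdays:

```python
# E[N] = sum over n >= 0 of P(N > n), where P(N > n) is the probability
# that the first n birthdays are all distinct (365 equally likely days).
expectation = 0.0
p_distinct = 1.0                        # P(N > 0) = 1
n = 0
while p_distinct > 0:
    expectation += p_distinct
    p_distinct *= (365 - n) / 365       # next person avoids the n birthdays seen so far
    n += 1
print(expectation)                      # approximately 24.617
```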
1.6 Let X and Y be independent standard normal random variables, and let U and V be
random variables that are derived from X and Y via the linear transformation
\[
\begin{pmatrix} U \\ V \end{pmatrix}
=
\begin{pmatrix} \sin\alpha & -\cos\alpha \\ \cos\alpha & \sin\alpha \end{pmatrix}
\begin{pmatrix} X \\ Y \end{pmatrix} .
\]
(a) Derive the joint pdf of U and V.
(b) Show that U and V are independent and standard normally distributed.
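Before deriving the pdf, the claim can be checked numerically. The sketch below (with $\alpha = 0.7$ assumed for illustration) only verifies the first and second moments, which for a jointly normal pair already pin down independence:

```python
import math
import numpy as np

rng = np.random.default_rng(0)
alpha, n = 0.7, 100_000                  # alpha is assumed for illustration
x = rng.standard_normal(n)
y = rng.standard_normal(n)
u = math.sin(alpha) * x - math.cos(alpha) * y
v = math.cos(alpha) * x + math.sin(alpha) * y
print(np.cov(u, v))   # approximately the 2x2 identity matrix
```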
1.7 Let $X \sim \mathsf{Exp}(\lambda)$. Show that the memoryless property holds:
\[
P(X > t + s \mid X > t) = P(X > s) \quad \text{for all } s, t > 0 .
\]
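A quick empirical check of this identity (the values of $\lambda$, $s$, and $t$ are picked arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, s, t = 2.0, 0.4, 0.9                    # values assumed for illustration
x = rng.exponential(1 / lam, 1_000_000)      # numpy parametrizes by scale = 1/lambda
lhs = (x > t + s).sum() / (x > t).sum()      # P(X > t+s | X > t)
rhs = (x > s).mean()                         # P(X > s)
print(lhs, rhs)   # both close to exp(-lam*s), about 0.449
```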
1.8 Let $X_1, X_2, X_3$ be independent Bernoulli random variables with success probabilities $1/2$, $1/3$, and $1/4$, respectively. Give their conditional joint pdf, given that $X_1 + X_2 + X_3 = 2$.
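Since only three outcomes have sum 2, the conditional pmf can be tabulated by brute force; a sketch:

```python
from itertools import product

probs = [1/2, 1/3, 1/4]          # success probabilities of X1, X2, X3

def joint(x):
    """Joint pmf of the independent Bernoulli triple."""
    out = 1.0
    for xi, p in zip(x, probs):
        out *= p if xi == 1 else 1 - p
    return out

total = sum(joint(x) for x in product([0, 1], repeat=3) if sum(x) == 2)
for x in product([0, 1], repeat=3):
    if sum(x) == 2:
        print(x, joint(x) / total)   # conditional probability given the sum is 2
```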
1.9 Verify the expectations and variances in Table 1.1 below.
Table 1.1 Expectations and variances for some well-known distributions.

Dist.          E[X]              Var(X)
Bin(n, p)      np                np(1 - p)
G(p)           1/p               (1 - p)/p^2
Poi(λ)         λ                 λ
Exp(λ)         1/λ               1/λ^2
Gamma(α, λ)    α/λ               α/λ^2
N(μ, σ^2)      μ                 σ^2
Beta(α, β)     α/(α + β)         αβ/((α + β)^2 (α + β + 1))
Weib(α, λ)     Γ(1 + 1/α)/λ      (Γ(1 + 2/α) - Γ(1 + 1/α)^2)/λ^2
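Two of the rows can be spot-checked by simulation; a sketch with arbitrarily chosen parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# Gamma(alpha=2.5, lambda=3): mean alpha/lambda ~ 0.833, var alpha/lambda^2 ~ 0.278
g = rng.gamma(shape=2.5, scale=1 / 3.0, size=n)   # numpy uses scale = 1/lambda
print(g.mean(), g.var())

# Beta(2, 5): mean 2/7 ~ 0.286, var 10/(49*8) ~ 0.0255
b = rng.beta(2.0, 5.0, size=n)
print(b.mean(), b.var())
```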
1.10 Let $X$ and $Y$ have joint density $f$ given by
\[
f(x, y) = c\, x y, \quad 0 \leq y \leq x, \; 0 \leq x \leq 1 .
\]
(a) Determine the normalization constant $c$.

(b) Determine $P(X + 2Y \leq 1)$.
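Both parts can be estimated with uniform points on the unit square. A Monte Carlo sketch; for reference, direct integration gives $c = 8$ and a value of roughly $0.062$ for part (b):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
x, y = rng.random(n), rng.random(n)       # uniform points on the unit square
inside = y <= x                           # the support {0 <= y <= x <= 1}

# (a) 1/c = integral of x*y over the support, so invert the estimate
inv_c = np.mean(x * y * inside)
print(1 / inv_c)                          # close to 8

# (b) P(X + 2Y <= 1) = c * integral of x*y over the support
#     intersected with {x + 2y <= 1}
print(np.mean(x * y * inside * (x + 2 * y <= 1)) / inv_c)   # roughly 0.062
```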
1.11 Let $X \sim \mathsf{Exp}(\lambda)$ and $Y \sim \mathsf{Exp}(\mu)$ be independent. Show that

(a) $\min(X, Y) \sim \mathsf{Exp}(\lambda + \mu)$,

(b) $P(X < Y \mid \min(X, Y)) = \dfrac{\lambda}{\lambda + \mu}$.
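An empirical sketch of both claims (the rates $\lambda$ and $\mu$ are picked arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(4)
lam, mu, n = 1.5, 2.5, 1_000_000         # rates assumed for illustration
x = rng.exponential(1 / lam, n)
y = rng.exponential(1 / mu, n)
m = np.minimum(x, y)
print(m.mean(), 1 / (lam + mu))          # (a): matches the Exp(lam+mu) mean
print((x < y).mean(), lam / (lam + mu))  # (b): P(X < Y) = lam/(lam+mu)
```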
1.12 Verify the properties of variance and covariance in Table 1.2 below.
Table 1.2 Properties of variance and covariance.

1. Var(X) = E[X^2] - (E[X])^2
2. Var(aX + b) = a^2 Var(X)
3. Cov(X, Y) = E[XY] - E[X] E[Y]
4. Cov(X, Y) = Cov(Y, X)
5. Cov(aX + bY, Z) = a Cov(X, Z) + b Cov(Y, Z)
6. Cov(X, X) = Var(X)
7. Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
8. X and Y independent ⇒ Cov(X, Y) = 0
1.13 Show that the correlation coefficient always lies between $-1$ and $1$. (Hint: use the fact that the variance of $aX + Y$ is always nonnegative, for any $a$.)
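One way to use the hint, sketched as a worked step:

```latex
% Expanding the hinted variance gives a quadratic in a that is
% nonnegative for every real a:
\[
0 \le \operatorname{Var}(aX + Y)
  = a^2 \operatorname{Var}(X) + 2a \operatorname{Cov}(X, Y) + \operatorname{Var}(Y) .
\]
% A quadratic that never goes negative has nonpositive discriminant,
% so Cov(X,Y)^2 <= Var(X) Var(Y), which says exactly that rho^2 <= 1:
\[
\varrho^2 = \frac{\operatorname{Cov}(X, Y)^2}{\operatorname{Var}(X)\,\operatorname{Var}(Y)} \le 1 .
\]
```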
1.14 Consider Examples 1.1-1.2. Define $X$ as the function that assigns the number $x_1 + \cdots + x_n$ to each outcome $\omega = (x_1, \ldots, x_n)$. The event that there are exactly $k$ heads in $n$ throws can be written as
\[
\{\omega \in \Omega : X(\omega) = k\} .
\]
If we abbreviate this to $\{X = k\}$ and further abbreviate $P(\{X = k\})$ to $P(X = k)$, then we obtain exactly (1.7). Verify that one can always view random variables in this way, that is, as real-valued functions on $\Omega$, and that probabilities such as $P(X \leq x)$ should be interpreted as $P(\{\omega \in \Omega : X(\omega) \leq x\})$.
1.15 Show that
\[
\operatorname{Var}\Big(\sum_{i=1}^{n} X_i\Big)
= \sum_{i=1}^{n} \operatorname{Var}(X_i) + 2 \sum_{i<j} \operatorname{Cov}(X_i, X_j) .
\]
1.16 Let $\Sigma$ be the covariance matrix of a random column vector $\mathbf{X}$. Write $\mathbf{Y} = \mathbf{X} - \boldsymbol{\mu}$, where $\boldsymbol{\mu}$ is the expectation vector of $\mathbf{X}$. Hence, $\Sigma = E[\mathbf{Y}\mathbf{Y}^T]$. Show that $\Sigma$ is positive semidefinite. That is, for any vector $\mathbf{u}$, we have $\mathbf{u}^T \Sigma\, \mathbf{u} \geq 0$.
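A sketch of the key observation: $\mathbf{u}^T \mathbf{Y}$ is a scalar random variable, so

```latex
\[
\mathbf{u}^T \Sigma\, \mathbf{u}
= E\big[\mathbf{u}^T \mathbf{Y}\, \mathbf{Y}^T \mathbf{u}\big]
= E\big[(\mathbf{u}^T \mathbf{Y})^2\big] \ge 0 .
\]
```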