ISYE-6644 SIMULATION QUESTIONS & ANSWERS
(8.3) Find the sample variance of -3, -2, -1, 0, 1, 2, 3 - Answers -14/3 (≈ 4.667). The
mean is 0 and the sum of squared deviations is 28, so S^2 = 28/6 = 14/3. If the
sample is the entire population, then we divide by n = 7 instead, and the variance is 4.
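A quick arithmetic check, as a minimal sketch using Python's standard statistics
module (not part of the course materials):

```python
import statistics

data = [-3, -2, -1, 0, 1, 2, 3]
print(statistics.variance(data))   # sample variance, divides by n - 1: 4.666...
print(statistics.pvariance(data))  # population variance, divides by n: 4.0
```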
(8.1) M/M/1 queue - Answers -A single-server queue with i.i.d. exponential (Markovian)
interarrival times and i.i.d. exponential service times.
(8.3) If the expected value of your estimator equals the parameter that you're trying to
estimate, then your estimator is unbiased. True or False - Answers -True. This is the
definition of unbiasedness.
(8.3) If X1, X2, ..., Xn are i.i.d. with mean mu, then the sample mean X-bar is unbiased
for mu. True or False - Answers -True.
(8.4) What is the MSE (Mean Squared Error) of an estimator? - Answers -Bias^2 +
Variance
(8.3) What is the expected value of the sample mean X-bar of Pois(λ) random variables? -
Answers -λ. For the Poisson distribution, the mean and the variance both equal λ, and
X-bar is unbiased for the mean.
(8.3) What is the expected value of the sample variance S^2 of Pois(λ) random variables? -
Answers -λ. S^2 is unbiased for the variance, and the Poisson variance equals λ (the
same as its mean).
(8.4) Suppose that estimator A has bias = 3 and variance = 12, while estimator B has
bias = -2 and variance = 14. Which estimator (A or B) has the lower mean squared error?
- Answers -B. MSE = Bias^2 + Variance, so MSE(A) = 9 + 12 = 21 and
MSE(B) = 4 + 14 = 18, and 18 < 21.
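For reference, the comparison as a tiny Python check (the mse helper is hypothetical,
just encoding Bias^2 + Variance):

```python
def mse(bias, variance):
    # MSE = Bias^2 + Variance
    return bias**2 + variance

print(mse(3, 12))   # estimator A: 9 + 12 = 21
print(mse(-2, 14))  # estimator B: 4 + 14 = 18, so B has the lower MSE
```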
MLE - Answers -Maximum Likelihood Estimator - "A method of estimating the
parameters of a distribution by maximizing a likelihood function, so that under the
assumed statistical model the observed data is most probable."
(8.4) Suppose that X1=4, X2=3, X3=5 are i.i.d. realizations from an Exp(λ) distribution.
What is the MLE of λ? - Answers -0.25. The MLE of the exponential rate is
λhat = n / Σ Xi = 1/X-bar = 3/(4+3+5) = 0.25.
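The same computation in Python, as a minimal sketch:

```python
xs = [4, 3, 5]
lam_hat = len(xs) / sum(xs)  # MLE of the rate: n / sum(X_i) = 1 / X-bar
print(lam_hat)               # 0.25
```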
(8.5/8.6) If X1=2, X2=−2, and X3=0 are i.i.d. realizations from a Nor(μ, σ^2) distribution,
what is the value of the maximum likelihood estimate for the variance σ^2? - Answers -
8/3. The MLE of σ^2 is the sum of the squared deviations (Xi − μhat)^2 divided by n
(not n − 1), where μhat = X-bar. Here X-bar = 0, so σ^2hat = (4 + 4 + 0)/3 = 8/3.
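A quick numerical check of that formula (minimal Python sketch):

```python
xs = [2, -2, 0]
mu_hat = sum(xs) / len(xs)                            # MLE of mu is the sample mean: 0
var_hat = sum((x - mu_hat)**2 for x in xs) / len(xs)  # divide by n, not n - 1
print(var_hat)                                        # 2.666... = 8/3
```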
(8.5/8.6) Suppose we observe the Pois(λ) realizations X1=5, X2=9 and X3=1. What is
the maximum likelihood estimate of λ? - Answers -5. λ is estimated as the summation
of sample values divided by the number of sample values. (5+9+1)/3 = 5
(8.5) Suppose X1, ..., Xn are i.i.d. Bern(p). Find the MLE for p. - Answers -
phat = X-bar = (1/n) Σ Xi, the sample proportion of successes.
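A simulation sanity check that the sample mean recovers p (the seed and p_true = 0.3
are arbitrary choices for illustration):

```python
import random

random.seed(1)   # arbitrary seed for reproducibility
p_true = 0.3     # hypothetical true parameter
xs = [1 if random.random() < p_true else 0 for _ in range(100_000)]
print(sum(xs) / len(xs))  # phat = X-bar, close to 0.3
```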
(8.7) Suppose that we have a number of observations from a Pois(λ) distribution, and it
turns out that the MLE for λ is λhat=5. What's the maximum likelihood estimate of
Pr(X=3)? - Answers -0.1404. By invariance, plug λhat into the pmf
P(X=x) = λ^x * e^(−λ) / x!, giving 5^3 * e^(−5) / 3! ≈ 0.1404.
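Evaluating that pmf in Python, as a minimal sketch:

```python
import math

lam_hat, x = 5, 3
p_hat = lam_hat**x * math.exp(-lam_hat) / math.factorial(x)
print(p_hat)  # 0.14037...
```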
(8.6) TRUE or FALSE? It's possible to obtain MLEs for two parameters simultaneously,
e.g., μ and σ^2 for the Nor(μ, σ^2) distribution. - Answers -True
(8.6) TRUE or FALSE? Sometimes it might be difficult to obtain an MLE in closed form.
- Answers -True. (There is a gamma example.)
(8.7) Suppose that the MLE for a parameter θ is θhat=4. Find the MLE for √θ. -
Answers -2. Invariance immediately implies that the MLE of √θ is simply √θhat = 2
(8.8) Suppose that we observe X1 = 5, X2 = 9, and X3 = 1. What's the method of
moments estimate of E[X^2]? - Answers -107/3 ≈ 35.667. The second sample moment is
the sum of the squared observations divided by n: (5^2 + 9^2 + 1^2)/3 = 107/3.
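The same second-moment computation in Python, as a minimal sketch:

```python
xs = [5, 9, 1]
m2 = sum(x**2 for x in xs) / len(xs)  # second sample moment: (25 + 81 + 1) / 3
print(m2)                             # 35.666... = 107/3
```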
(8.9) Suppose we're conducting a χ^2 goodness-of-fit test with Type I error rate α = 0.01
to determine whether or not 100 i.i.d. observations are from a lognormal distribution with
unknown parameters μ and σ^2. If we divide the observations into 5 equal-probability
intervals and we observe a g-o-f statistic of χ0^2 = 11.2, will we ACCEPT (i.e., fail to
reject) or REJECT the null hypothesis of lognormality? - Answers -Reject. Start with
k = 5 cells, then subtract 1, and subtract s = 2 for the two estimated parameters, so the
degrees of freedom are k − s − 1 = 2. The critical value χ^2(α=0.01, 2) = 9.21. Since
11.2 > 9.21, we reject: the lognormal is not a good fit.
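If SciPy is available, the critical value can be looked up directly (a minimal sketch;
changing alpha to 0.05 reproduces the 5.99 cutoff used in the Geometric question below):

```python
from scipy.stats import chi2

alpha, k, s = 0.01, 5, 2
crit = chi2.ppf(1 - alpha, df=k - s - 1)  # upper-alpha critical value with 2 dof
print(crit)                               # 9.2103...
print(11.2 > crit)                        # True, so reject lognormality
```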
(8.9) Suppose H0 is true, but you've just rejected it! What have you done? - Answers -
Type I error
(8.10/8.11) The test statistic is χ0^2 = 9.12. Now, let's use our old friend α = 0.05 in our
test. Let k = 4 denote the number of cells (that we ultimately ended up with) and let s =
1 denote the number of parameters we had to estimate. Then we compare against
χ^2(α=0.05 , k − s − 1) = χ^2(α=0.05 , 2) = 5.99. Do we ACCEPT (i.e., fail to reject) or
REJECT the Geometric hypothesis? - Answers -Reject. The test statistic 9.12 is not
less than 5.99.
(8.12) Consider the PRNs U1 = 0.1, U2 = 0.9, and U3 = 0.2. Use Kolmogorov-Smirnov
with α = 0.05 to test to see if these numbers are indeed uniform. Do we ACCEPT (i.e.,
fail to reject) or REJECT uniformity? - Answers -Accept (fail to reject). From the table,
the critical value D(α=0.05, n=3) = 0.70760. Order the sample: 0.1, 0.2, 0.9. The K-S
statistic is D = max(D+, D−) = 0.467. Since 0.467 < 0.70760, we fail to reject uniformity.
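Computing D+ and D− by hand in Python (a minimal sketch of the standard K-S statistic):

```python
us = sorted([0.1, 0.9, 0.2])
n = len(us)
d_plus = max((i + 1) / n - u for i, u in enumerate(us))  # 2/3 - 0.2 = 0.4667
d_minus = max(u - i / n for i, u in enumerate(us))       # 0.9 - 2/3 = 0.2333
d = max(d_plus, d_minus)
print(d, d < 0.70760)  # 0.4667 True, so fail to reject uniformity
```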
(9.1) TRUE or FALSE? Simulation output (e.g., consecutive customer waiting times) is
almost never i.i.d. normal - and that's a big fat problem! - Answers -True