ISYE 6414 FINAL EXAM REVIEW QUESTIONS WITH VERIFIED SOLUTIONS
Least Squares Estimation (LSE) cannot be applied to GLMs. -- Answer ✔✔ False - LSE is applicable, but it does not fully use the distributional information in the data.
In multiple linear regression with iid errors and equal variance, the least squares estimates of the regression coefficients are always unbiased. -- Answer ✔✔ True - the least squares estimates are BLUE (Best Linear Unbiased Estimators) in multiple linear regression.
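A quick simulation sketch (with made-up true coefficients and noise level, not values from the course) illustrates the unbiasedness claim: averaging the least squares estimates over many simulated data sets recovers the true coefficients.

```python
import numpy as np

# Illustrative simulation: OLS estimates are unbiased under iid normal
# errors with equal variance. true_beta, n, and the noise scale are assumed.
rng = np.random.default_rng(0)
true_beta = np.array([2.0, -1.5])          # intercept and slope (assumed)
n, reps = 50, 2000
x = np.linspace(0, 1, n)
X = np.column_stack([np.ones(n), x])       # design matrix with intercept

estimates = np.empty((reps, 2))
for i in range(reps):
    y = X @ true_beta + rng.normal(scale=0.5, size=n)  # iid, equal variance
    # Least squares estimate: beta_hat = (X'X)^{-1} X'y
    estimates[i] = np.linalg.lstsq(X, y, rcond=None)[0]

mean_est = estimates.mean(axis=0)
print(mean_est)  # averages close to true_beta, illustrating unbiasedness
```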
Maximum Likelihood Estimation is not applicable for simple linear regression or multiple linear regression. -- Answer ✔✔ False - In SLR and MLR, the LSE and MLE are the same with iid normal data.
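The equivalence can be seen numerically: with iid normal errors, the Gaussian log-likelihood is a decreasing function of the sum of squared errors, so the slope that minimizes SSE also maximizes the likelihood. A small sketch over simulated data (slope grid and noise scale are assumed for illustration):

```python
import numpy as np

# Sketch: under iid normal errors, minimizing SSE (LSE) and maximizing the
# Gaussian log-likelihood (MLE) pick the same slope. Data are simulated.
rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 100)
y = 3.0 * x + rng.normal(scale=0.3, size=100)

slopes = np.linspace(2.0, 4.0, 401)
sse = np.array([np.sum((y - b * x) ** 2) for b in slopes])
# Log-likelihood with sigma fixed: constant - SSE / (2 * sigma^2),
# i.e. a strictly decreasing function of SSE.
loglik = -sse / (2 * 0.3 ** 2)

print(slopes[np.argmin(sse)], slopes[np.argmax(loglik)])  # identical
```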
Backward elimination requires a pre-set probability of Type II error. -- Answer ✔✔ False - it requires a pre-set probability of Type I error.
The first degree of freedom of the F distribution in any of the three stepwise procedures is always equal to one. -- Answer ✔✔ True - each step tests adding or removing a single predictor, so the numerator degrees of freedom is one.
MLE is used in GLMs to handle the complicated link-function modeling of the X-Y relationship. -- Answer ✔✔ True
In GLMs the link function cannot be nonlinear. -- Answer ✔✔ False - the link function can be linear or nonlinear, as long as it is a known parametric function.
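As a concrete (nonlinear) example, the logit link of logistic regression maps the mean response, which is constrained to (0, 1), onto the unconstrained scale of the linear predictor. The values of the linear predictor below are made up for illustration:

```python
import numpy as np

# Sketch of the GLM idea: a (possibly nonlinear) link g connects the mean
# of Y to the linear predictor. The logit link is shown as one example.
def logit(mu):
    return np.log(mu / (1 - mu))

def inv_logit(eta):
    return 1 / (1 + np.exp(-eta))

eta = np.array([-2.0, 0.0, 2.0])     # linear predictor values (assumed)
mu = inv_logit(eta)                  # mean response, constrained to (0, 1)
# The link maps the constrained mean back to the unconstrained scale,
# which is how GLMs handle the constraints on the parameters.
recovered = logit(mu)
print(np.allclose(recovered, eta))   # True
```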
When the p-value of the slope estimate in SLR is small, the R-squared becomes smaller too. -- Answer ✔✔ False - when the p-value is small, the model fit is more significant and R-squared tends to be larger.
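A small numerical sketch of the R-squared side of this claim (simulated data; the slope and noise level are assumed): a strong, highly significant slope leaves little unexplained variability, so R-squared is large.

```python
import numpy as np

# Illustration: R-squared = 1 - SSE/SST, the fraction of variability in y
# explained by the fitted line. Data are simulated for illustration only.
rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 60)
y = 1.0 + 2.0 * x + rng.normal(scale=0.2, size=60)  # strong slope, low noise

X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

sse = np.sum(resid ** 2)                 # unexplained variability
sst = np.sum((y - y.mean()) ** 2)        # total variability
r2 = 1 - sse / sst
print(r2)  # large when the slope is strongly significant
```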
In GLMs, the main reason one does not use LSE to estimate the model parameters is the potential constraints on the parameters. -- Answer ✔✔ False - the potential constraints on the parameters of GLMs are handled by the link function.
R-squared and adjusted R-squared are not appropriate for comparing nonlinear regression models, but they are for linear regression models. -- Answer ✔✔ True - the underlying assumption of the R-squared calculation is that you are fitting a linear model.
The decision in using the ANOVA table for testing whether a model is significant depends on the normal distribution of the response variable. -- Answer ✔✔ True - the F-test in the ANOVA table relies on the normality assumption.
When the data may not be normally distributed, AIC is more appropriate for variable selection than adjusted R-squared. -- Answer ✔✔ True
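A sketch of AIC-based comparison on simulated data. For a Gaussian linear model, up to an additive constant, AIC = n·log(SSE/n) + 2k, where k counts the estimated coefficients; lower is better. The data, coefficients, and noise level below are assumed for illustration:

```python
import numpy as np

# Sketch: variable selection with AIC. Comparing an intercept-only model
# against a model with one relevant predictor on simulated data.
rng = np.random.default_rng(3)
n = 80
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + rng.normal(scale=0.3, size=n)

def aic(sse, n, k):
    # Gaussian linear model AIC, up to an additive constant
    return n * np.log(sse / n) + 2 * k

sse_null = np.sum((y - y.mean()) ** 2)          # intercept-only model
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
sse_full = np.sum((y - X @ beta) ** 2)          # model with the predictor

# The model containing the relevant predictor has the lower (better) AIC.
print(aic(sse_full, n, 2) < aic(sse_null, n, 1))  # True
```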
The slope of a linear regression equation is an example of a correlation coefficient. -- Answer ✔✔ False - the correlation coefficient is the r value; it has the same + or - sign as the slope but is a different quantity.
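The exact relationship is r = b1 · sd(x)/sd(y): same sign, different value. A short check on simulated data (the coefficients below are made up):

```python
import numpy as np

# Illustration: the correlation r and the fitted slope b1 satisfy
# r = b1 * sd(x) / sd(y), so they share a sign but generally differ.
rng = np.random.default_rng(4)
x = rng.uniform(0, 1, 100)
y = 5.0 - 3.0 * x + rng.normal(scale=0.5, size=100)

b1 = np.polyfit(x, y, 1)[0]          # fitted OLS slope
r = np.corrcoef(x, y)[0, 1]          # correlation coefficient

print(np.sign(b1) == np.sign(r))                   # True: same sign
print(np.isclose(r, b1 * np.std(x) / np.std(y)))   # True: exact identity
```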
In multiple linear regression, as the value of R-squared increases, the relationships between the predictors become stronger. -- Answer ✔✔ False - R-squared measures how much of the variability in the response is explained by the model, NOT how strongly the predictors are related.
When dealing with a multiple linear regression model, the adjusted R-squared can be greater than the corresponding unadjusted R-squared value. -- Answer ✔✔ False - the adjusted R-squared takes the number of predictors into account and is never larger than the unadjusted R-squared.
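This follows directly from the adjustment formula, sketched below with example values (the R-squared, sample size, and predictor count are assumed for illustration):

```python
# Adjusted R-squared penalizes extra predictors:
#   R2_adj = 1 - (1 - R2) * (n - 1) / (n - p - 1)
# Since (n - 1) / (n - p - 1) >= 1 whenever p >= 1, R2_adj <= R2.
def adjusted_r2(r2, n, p):
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

r2 = 0.85        # example unadjusted R-squared (assumed)
n, p = 50, 4     # sample size and number of predictors (assumed)
r2_adj = adjusted_r2(r2, n, p)
print(round(r2_adj, 4))  # 0.8367, below the unadjusted 0.85
```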