questions well answered
Least Squares Estimation (LSE) cannot be applied to GLM models. - correct answer ✔✔False - it
is applicable, but it does not fully use the distributional information in the data.
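A minimal sketch, with simulated Bernoulli data (all names hypothetical), showing that least squares can be applied to a logistic mean function, while MLE uses the full Bernoulli likelihood:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # logistic function

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([-0.5, 1.2])
y = rng.binomial(1, expit(X @ beta_true))

# Least squares: minimize squared residuals around the logistic mean.
# Applicable, but it ignores the Bernoulli variance structure of the data.
lse = minimize(lambda b: np.sum((y - expit(X @ b)) ** 2), x0=np.zeros(2))

# Maximum likelihood: minimize the Bernoulli negative log-likelihood,
# which uses the full distributional information.
def nll(b):
    p = np.clip(expit(X @ b), 1e-12, 1 - 1e-12)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

mle = minimize(nll, x0=np.zeros(2))
print("LSE:", lse.x, "MLE:", mle.x)  # both near beta_true; MLE is more efficient
```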
In multiple linear regression with iid errors and equal variance, the least squares estimates of the
regression coefficients are always unbiased. - correct answer ✔✔True - the least squares
estimates are BLUE (Best Linear Unbiased Estimators) in multiple linear regression.
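A small simulation sketch (the data-generating setup is hypothetical): averaging least squares estimates over many replicated datasets recovers the true coefficients, illustrating unbiasedness.

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 100, 2000
beta_true = np.array([2.0, -1.0, 0.5])  # intercept and two slopes
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # fixed design

estimates = np.empty((reps, 3))
for r in range(reps):
    y = X @ beta_true + rng.normal(scale=1.5, size=n)  # iid, equal variance
    estimates[r], *_ = np.linalg.lstsq(X, y, rcond=None)

print(estimates.mean(axis=0))  # ~ [2.0, -1.0, 0.5]: unbiased on average
```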
Maximum Likelihood Estimation is not applicable for simple linear regression or multiple linear
regression. - correct answer ✔✔False - in SLR and MLR with iid normal errors, the LSE and MLE
of the regression coefficients are identical.
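A quick sketch, assuming statsmodels and simulated data (not from the source), showing that the least squares fit (OLS) and the maximum likelihood fit (Gaussian GLM) return the same coefficients:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
X = sm.add_constant(rng.normal(size=(200, 2)))
y = X @ [1.0, 0.8, -0.3] + rng.normal(size=200)

ols = sm.OLS(y, X).fit()                                 # least squares
glm = sm.GLM(y, X, family=sm.families.Gaussian()).fit()  # maximum likelihood
print(ols.params, glm.params)  # identical up to numerical precision
```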
Backward elimination requires a pre-set probability of Type II error. - correct answer
✔✔False - it requires a pre-set probability of Type I error (the significance level for dropping variables).
The first degree of freedom in the F distribution for any of the three procedures in stepwise regression is
always equal to one. - correct answer ✔✔True - each step adds or removes a single predictor, so the
partial F-test compares nested models that differ by exactly one parameter.
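A sketch of the partial F-statistic at one step, with hypothetical nested designs that differ by exactly one predictor:

```python
import numpy as np
from scipy import stats

def partial_f(y, X_full, X_reduced):
    def sse(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return np.sum((y - X @ beta) ** 2)
    df1 = X_full.shape[1] - X_reduced.shape[1]  # = 1 in stepwise procedures
    df2 = len(y) - X_full.shape[1]
    F = ((sse(X_reduced) - sse(X_full)) / df1) / (sse(X_full) / df2)
    return F, 1 - stats.f.cdf(F, df1, df2)

rng = np.random.default_rng(4)
Xr = np.column_stack([np.ones(50), rng.normal(size=50)])
Xf = np.column_stack([Xr, rng.normal(size=50)])
y = Xf @ [1.0, 0.5, 0.0] + rng.normal(size=50)
print(partial_f(y, Xf, Xr))  # F on (1, n - 3) degrees of freedom
```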
MLE is used in GLMs to handle the complicated link-function modeling of the X-Y
relationship. - correct answer ✔✔True
In GLMs the link function cannot be a nonlinear function. - correct answer ✔✔False - it
can be linear, nonlinear, or any other parametric form.
When the p-value of the slope estimate in SLR is small, the R-squared becomes smaller too. -
correct answer ✔✔False - when the p-value is small, the model fit is more significant and R-squared
becomes larger.
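One way to see this, using standard SLR identities (n observations; these symbols are not in the source): the slope t-statistic, the model F-statistic, and R-squared satisfy

$F = t^2 = (n - 2)\,\frac{R^2}{1 - R^2}$,

so a smaller p-value (larger F) corresponds to a larger R-squared.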
In GLMs, the main reason one does not use LSE to estimate model parameters is the potential
constraints on the parameters. - correct answer ✔✔False - the potential constraints on the
parameters of GLMs are handled by the link function.
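For example (a standard case, assumed here for illustration): in Poisson regression the rate $\lambda_i$ must be positive, and the log link enforces this for any real coefficient vector $\beta$:

$\log(\lambda_i) = x_i^\top \beta \;\Longleftrightarrow\; \lambda_i = e^{x_i^\top \beta} > 0$.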
R-squared and adjusted R-squared are not appropriate model comparisons for nonlinear
regression models but are for linear regression models. - correct answer ✔✔True - the underlying
assumption of the R-squared calculation is that you are fitting a linear model.
The decision to use the ANOVA table for testing whether a model is significant depends on the
normal distribution of the response variable. - correct answer ✔✔True
When the data may not be normally distributed, AIC is more appropriate for variable selection
than adjusted R-squared - correct answer ✔✔True
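A minimal sketch, with hypothetical candidate Poisson models (statsmodels assumed), of AIC-based comparison; lower AIC is preferred:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
x1, x2 = rng.normal(size=(2, 300))
y = rng.poisson(np.exp(0.3 + 0.7 * x1))  # x2 is irrelevant noise

X1 = sm.add_constant(np.column_stack([x1]))
X2 = sm.add_constant(np.column_stack([x1, x2]))

fit1 = sm.GLM(y, X1, family=sm.families.Poisson()).fit()
fit2 = sm.GLM(y, X2, family=sm.families.Poisson()).fit()
print(fit1.aic, fit2.aic)  # the irrelevant predictor is penalized by AIC
```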
The slope of a linear regression equation is an example of a correlation coefficient. - correct
answer ✔✔False - the correlation coefficient is the r value; it has the same sign (+ or -) as the
slope but is a different quantity.
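The exact relationship, a standard SLR identity (symbols supplied here for illustration): $\hat{\beta}_1 = r \, s_y / s_x$, where $s_x$ and $s_y$ are the sample standard deviations, so the slope equals r only when $s_x = s_y$.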
In multiple linear regression, as the value of R-squared increases, the relationship
between predictors becomes stronger. - correct answer ✔✔False - R-squared measures how
much variability in the response is explained by the model, NOT how strongly the predictors are related.
When dealing with a multiple linear regression model, an adjusted R-squared can
be greater than the corresponding unadjusted R-squared value. - correct answer ✔✔False - the
adjusted R-squared takes the number of predictors into account, so it is never larger than
the R-squared value.
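The standard adjustment, with n observations and p predictors (formula supplied here, not from the source), makes this explicit:

$R^2_{\text{adj}} = 1 - (1 - R^2)\,\frac{n - 1}{n - p - 1} \le R^2$.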