UPDATE|COMPREHENSIVE QUESTIONS AND VERIFIED SOLUTIONS/CORRECT ANSWERS|GET 100% ACCURATE!!

Terms in this set (111)

Least Squares Estimation (LSE) cannot be applied to GLM models.

False - It is applicable, but it does not fully use the data's distribution information.
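A minimal sketch of that point, assuming numpy, scipy, and statsmodels are available: the same Poisson-mean model can be fit by plain (nonlinear) least squares, which ignores the distribution, or by GLM maximum likelihood, which uses it. Data are simulated and all names are illustrative.

    import numpy as np
    import statsmodels.api as sm
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 2, 200)
    y = rng.poisson(np.exp(0.5 + 1.0 * x))            # true model: log(mu) = 0.5 + 1.0*x

    # Least squares: minimize sum (y - exp(b0 + b1*x))^2, no distribution assumed
    (b0_ls, b1_ls), _ = curve_fit(lambda t, b0, b1: np.exp(b0 + b1 * t), x, y, p0=[0.0, 0.0])

    # Maximum likelihood via a Poisson GLM with log link
    glm = sm.GLM(y, sm.add_constant(x), family=sm.families.Poisson()).fit()

    print("least squares:", b0_ls, b1_ls)
    print("MLE (GLM):    ", glm.params)               # similar point estimates, different efficiency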
In multiple linear regression with iid errors and equal variance, the least squares estimates of the regression coefficients are always unbiased.

True - The least squares estimates are BLUE (Best Linear Unbiased Estimates) in multiple linear regression.
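A small simulation sketch of that unbiasedness claim, assuming only numpy is available; the design, coefficients, and numbers are illustrative.

    import numpy as np

    rng = np.random.default_rng(1)
    n, beta = 100, np.array([2.0, -1.0, 0.5])         # intercept and two slopes
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])

    estimates = []
    for _ in range(2000):
        y = X @ beta + rng.normal(scale=1.0, size=n)  # iid errors, equal variance
        estimates.append(np.linalg.lstsq(X, y, rcond=None)[0])

    print("true beta:     ", beta)
    print("mean estimate: ", np.mean(estimates, axis=0))   # approximately equal to beta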
Maximum Likelihood Estimation is not applicable for simple linear regression and multiple linear regression.

False - In SLR and MLR, the LSE and MLE are the same with normal iid data.
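A minimal sketch of that equivalence, assuming numpy and scipy are available: minimizing the normal negative log-likelihood and minimizing the sum of squared errors give the same coefficient estimates. Simulated data; names are illustrative.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(2)
    x = rng.normal(size=80)
    y = 1.5 + 2.0 * x + rng.normal(scale=0.7, size=80)
    X = np.column_stack([np.ones_like(x), x])

    beta_lse = np.linalg.lstsq(X, y, rcond=None)[0]   # least squares

    def neg_loglik(theta):                            # theta = (b0, b1, log sigma)
        b0, b1, log_s = theta
        resid = y - (b0 + b1 * x)
        return 0.5 * np.sum(resid**2) / np.exp(2 * log_s) + len(y) * log_s

    beta_mle = minimize(neg_loglik, x0=[0.0, 0.0, 0.0]).x[:2]
    print(beta_lse, beta_mle)                         # the two estimates agree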
Backward elimination requires a pre-set probability of type II error.

False - It requires a pre-set probability of type I error (the significance level).
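A rough sketch of backward elimination driven by a pre-set significance level, assuming pandas and statsmodels are available; the data, alpha, and column names (x1-x4) are hypothetical.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    X = pd.DataFrame(rng.normal(size=(120, 4)), columns=["x1", "x2", "x3", "x4"])
    y = 3.0 + 2.0 * X["x1"] - 1.0 * X["x2"] + rng.normal(size=120)   # x3, x4 are noise

    alpha, cols = 0.05, list(X.columns)
    while cols:
        fit = sm.OLS(y, sm.add_constant(X[cols])).fit()
        pvals = fit.pvalues.drop("const")             # ignore the intercept
        if pvals.max() <= alpha:
            break
        cols.remove(pvals.idxmax())                   # drop the least significant predictor

    print("retained predictors:", cols)               # typically ['x1', 'x2']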
The first degree of freedom in the F distribution for any of the three procedures in stepwise regression is always equal to one.

True
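A worked sketch of why, assuming only numpy: each step adds or removes a single variable, so the partial F statistic compares nested models that differ by one parameter, giving a numerator df of 1. Simulated data.

    import numpy as np

    rng = np.random.default_rng(4)
    n = 100
    x1, x2 = rng.normal(size=n), rng.normal(size=n)
    y = 1.0 + 0.8 * x1 + 0.4 * x2 + rng.normal(size=n)

    def rss(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return np.sum((y - X @ beta) ** 2)

    rss_reduced = rss(np.column_stack([np.ones(n), x1]))        # model without x2
    rss_full = rss(np.column_stack([np.ones(n), x1, x2]))       # model with x2

    df1 = 1                                                     # one variable added
    F = ((rss_reduced - rss_full) / df1) / (rss_full / (n - 3)) # denominator df = n - p - 1
    print("partial F with df1 = 1:", F)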
MLE is used for GLMs to handle complicated link-function modeling of the X-Y relationship.

True
In GLMs the link function cannot be a nonlinear function.

False - It can be linear, nonlinear, or parametric.
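A minimal sketch tying the last two cards together, assuming numpy and statsmodels are available: a binomial GLM with the (nonlinear) logit link, fit by maximum likelihood. Simulated data; names are illustrative.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    x = rng.normal(size=200)
    p = 1 / (1 + np.exp(-(0.5 + 1.2 * x)))            # logit link: a nonlinear mean function
    y = rng.binomial(1, p)

    result = sm.GLM(y, sm.add_constant(x), family=sm.families.Binomial()).fit()
    print(result.params)                              # MLE estimates of (0.5, 1.2)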
When the p-value of the slope estimate in SLR is small, the R-squared becomes smaller too.

False - When the p-value is small, the model fit is more significant and the R-squared is larger.
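A sketch of the link between the two quantities in SLR, assuming numpy and statsmodels are available: for a fixed sample size, F = (n - 2) R^2 / (1 - R^2), and the slope p-value comes from this F (t^2 = F), so a smaller p-value goes with a larger R-squared. Simulated data.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(6)
    x = rng.normal(size=60)
    y = 0.5 * x + rng.normal(size=60)

    fit = sm.OLS(y, sm.add_constant(x)).fit()
    n, r2 = len(y), fit.rsquared
    print(fit.fvalue, (n - 2) * r2 / (1 - r2))        # the two values match
    print(fit.f_pvalue)                               # small p-value <-> large F <-> large R-squared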
In GLMs the main reason one does not use LSE to estimate model parameters is the potential constraint on the parameters.

False - The potential constraint on the parameters of GLMs is handled by the link function.

In a multiple regression problem, a quantitative input variable x is replaced by x − mean(x). The R-squared for the fitted model will be the same.

True
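A quick check of the centering claim, assuming numpy and statsmodels are available: replacing x by x − mean(x) only shifts the intercept, so the fitted values and R-squared are unchanged. Simulated data; names are illustrative.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(7)
    x = rng.normal(loc=10, size=100)
    z = rng.normal(size=100)
    y = 2.0 + 0.5 * x - 1.0 * z + rng.normal(size=100)

    fit_raw = sm.OLS(y, sm.add_constant(np.column_stack([x, z]))).fit()
    fit_centered = sm.OLS(y, sm.add_constant(np.column_stack([x - x.mean(), z]))).fit()
    print(fit_raw.rsquared, fit_centered.rsquared)    # identical R-squared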
The R-squared and adjusted R-squared are not appropriate model comparisons for nonlinear regression but are for linear regression models.

TRUE - The underlying assumption of the R-squared calculation is that you are fitting a linear model.

The decision in using the ANOVA table for testing whether a model is significant depends on the normal distribution of the response variable.

True

If the outcome variable is quantitative and all explanatory variables take values 0 or 1, a logistic regression model is most appropriate.

False - Logistic regression requires a categorical (e.g., binary) outcome; with a quantitative outcome, a linear regression/ANOVA model is more appropriate.

The estimated coefficients of a regression line are positive when the coefficient of determination is positive.

False - R-squared is always positive, regardless of the sign of the estimated coefficients.
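A minimal demonstration of that last card, assuming numpy and statsmodels are available: a negative estimated slope still produces a positive R-squared. Simulated data.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(8)
    x = rng.normal(size=100)
    y = 4.0 - 2.0 * x + rng.normal(size=100)          # true slope is negative

    fit = sm.OLS(y, sm.add_constant(x)).fit()
    print("slope:", fit.params[1])                    # negative
    print("R-squared:", fit.rsquared)                 # still positive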
When the data may not be normally distributed, AIC is more appropriate for variable selection than adjusted R-squared.

True

After fitting a logistic regression model, a plot of residuals versus fitted values is useful for checking if model assumptions are violated.

False - For logistic regression, use deviance residuals.
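A short sketch for both cards above, assuming numpy and statsmodels are available: fit a logistic GLM, pull its deviance residuals for diagnostics, and note that the fit also reports AIC for likelihood-based model comparison. Simulated data.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(9)
    x = rng.normal(size=200)
    y = rng.binomial(1, 1 / (1 + np.exp(-(0.3 + 1.0 * x))))

    result = sm.GLM(y, sm.add_constant(x), family=sm.families.Binomial()).fit()
    dev_resid = result.resid_deviance                 # deviance residuals for diagnostics
    print(dev_resid[:5])
    print("AIC:", result.aic)                         # likelihood-based selection criterion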
In a greenhouse experiment with several predictors, the response variable is the number of seeds that germinate out of 60 that are planted with different treatment combinations. A Poisson regression model is most appropriate for modeling this data.

False - Poisson regression models unbounded count or rate data; here the count is bounded (out of 60 planted), so a binomial (logistic) regression model is more appropriate.

In multiple linear regression, as the value of R-squared increases, the relationship between predictors becomes stronger.

False - R-squared measures how much variability is explained by the model, NOT how strong the predictors are.

The slope of a linear regression equation is an example of a correlation coefficient.

False - The correlation coefficient is the r value; it will have the same + or - sign as the slope.
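A quick numeric check of the slope/correlation card, assuming only numpy: the slope and r share the same sign, and in SLR the slope equals r times sd(y)/sd(x). Simulated data.

    import numpy as np

    rng = np.random.default_rng(10)
    x = rng.normal(size=100)
    y = -1.5 * x + rng.normal(size=100)

    r = np.corrcoef(x, y)[0, 1]
    slope = np.polyfit(x, y, 1)[0]
    print(r, slope)                                   # same sign, different values
    print(r * y.std(ddof=1) / x.std(ddof=1))          # reproduces the least-squares slope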
For Poisson regression, we can reduce type I errors of identifying statistical significance in the regression coefficients by increasing the sample size.

True

When dealing with a multiple linear regression model, an adjusted R-squared can be greater than the corresponding unadjusted R-squared value.

False - The adjusted R-squared takes the number and types of predictors into account; it is lower than the R-squared value.

Both LASSO and ridge regression always provide greater residual sum of squares than that of simple multiple linear regression.
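A small sketch for the adjusted R-squared statement above, assuming numpy and statsmodels are available: adjusted R^2 = 1 - (1 - R^2)(n - 1)/(n - p - 1), so it never exceeds R-squared, and adding a pure-noise predictor raises R-squared while typically lowering the adjusted value. Simulated data; names are illustrative.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(11)
    n = 80
    x = rng.normal(size=n)
    noise_col = rng.normal(size=n)                    # unrelated to y
    y = 1.0 + 0.6 * x + rng.normal(size=n)

    fit1 = sm.OLS(y, sm.add_constant(x)).fit()
    fit2 = sm.OLS(y, sm.add_constant(np.column_stack([x, noise_col]))).fit()
    print(fit1.rsquared, fit1.rsquared_adj)           # adjusted value is never larger
    print(fit2.rsquared, fit2.rsquared_adj)           # R^2 goes up, adjusted R^2 usually drops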