1. If there are variables that need to be used to control for bias in the model, they should be
forced into the model and not be part of the variable selection process. - correct
answer ✔✔True
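One way to force such variables into a penalized fit is to exempt them from the penalty. A minimal sketch, not from the source, assuming the glmnet package: a penalty factor of 0 means the corresponding coefficient is never shrunk or dropped, so that variable stays in the model.

    # Illustrative only: x1 plays the role of a variable we must keep in the model.
    library(glmnet)
    set.seed(1)
    n <- 100
    x <- matrix(rnorm(n * 5), n, 5)
    y <- 2 * x[, 1] + rnorm(n)
    pf <- c(0, rep(1, 4))                    # 0 = unpenalized (forced in), 1 = usual penalty
    fit <- glmnet(x, y, alpha = 1, penalty.factor = pf)
    coef(fit, s = 0.5)                       # x1 is retained even at a fairly large lambda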
2. Penalization in linear regression models means penalizing for complex models, that is, models
with a large number of predictors. - correct answer ✔✔True
3. Elastic net regression uses both the ridge and lasso penalties and hence
combines the benefits of both. - correct answer ✔✔True
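A minimal sketch, not from the source, assuming the glmnet package, where the mixing parameter alpha interpolates between the two penalties: alpha = 0 is ridge, alpha = 1 is lasso, and intermediate values give the elastic net.

    library(glmnet)
    set.seed(1)
    x <- matrix(rnorm(100 * 10), 100, 10)
    y <- x[, 1] - x[, 2] + rnorm(100)
    ridge <- glmnet(x, y, alpha = 0)      # pure L2 (ridge) penalty
    lasso <- glmnet(x, y, alpha = 1)      # pure L1 (lasso) penalty
    enet  <- glmnet(x, y, alpha = 0.5)    # mixture of both penalties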
4. Variable selection can be applied to regression problems when the number of predicting
variables is larger than the number of observations. - correct answer ✔✔True
5. The lasso regression performs well under multicollinearity. - correct answer ✔✔False
6. The selected variables using best subset regression are the best ones in explaining and
predicting the response variables. - correct answer ✔✔False
8. The lasso regression requires a numerical algorithm to minimize the penalized sum of least
squares. - correct answer ✔✔True
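A brief reason, stated with the standard formulas rather than anything from the source: the ridge estimator has a closed-form solution, whereas the lasso objective contains a non-differentiable L1 term, so it must be minimized numerically (e.g., by coordinate descent).

    \hat{\beta}^{\text{ridge}} = (X^\top X + \lambda I)^{-1} X^\top y \quad \text{(closed form)}
    \hat{\beta}^{\text{lasso}} = \arg\min_{\beta} \; \|y - X\beta\|_2^2 + \lambda \sum_{j=1}^{p} |\beta_j| \quad \text{(no closed form)}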
9. An unbiased estimator of the prediction risk is the training risk. - correct answer ✔✔False
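A one-line justification, using the standard optimism result for linear regression rather than anything from the source: on average the training risk understates the prediction risk by roughly 2pσ²/n for a model with p coefficients, noise variance σ², and n observations, which is why criteria such as Mallows' Cp and AIC add a complexity correction.

    E[\text{prediction risk}] \;\approx\; E[\text{training risk}] + \frac{2\,p\,\sigma^2}{n}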
10. Backward and forward stepwise regression will generally provide different sets of selected
variables when p, the number of predicting variables, is large. - correct answer ✔✔True
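A minimal sketch, not from the source, using base R's step() on the built-in mtcars data to illustrate that the two search directions need not agree:

    full <- lm(mpg ~ ., data = mtcars)
    null <- lm(mpg ~ 1, data = mtcars)
    backward <- step(full, direction = "backward", trace = 0)
    forward  <- step(null, direction = "forward", scope = formula(full), trace = 0)
    formula(backward)   # compare the two selected models
    formula(forward)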
11. All regularized regression approaches can be used for variable selection. - correct answer
✔✔False
12. Before performing regularized regression, we need to standardize or rescale the predicting
variables. - correct answer ✔✔True
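A minimal sketch, not from the source: rescaling puts all predictors on a comparable scale so the penalty does not depend on their units of measurement. The glmnet package standardizes internally by default (standardize = TRUE); scale() makes the step explicit.

    library(glmnet)
    x <- as.matrix(mtcars[, -1])          # illustrative predictors
    y <- mtcars$mpg
    x_std <- scale(x)                     # center and scale each column
    fit <- glmnet(x_std, y, alpha = 1, standardize = FALSE)
    # or simply glmnet(x, y, alpha = 1), which standardizes internally by default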
13. The larger the number of predicting variables, the larger the bias but the smaller the
variance. - correct answer ✔✔False
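The trade-off runs the other way, per the standard bias-variance decomposition (a textbook identity, not from the source): adding predictors reduces the bias of the fit but increases its variance.

    E\big[(Y - \hat{f}(x))^2\big] = \text{Bias}^2\big[\hat{f}(x)\big] + \text{Var}\big[\hat{f}(x)\big] + \sigma^2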
14. Variable selection is a simple and solved statistical problem since we can implement it using
the R statistical software. - correct answer ✔✔False
15. BIC penalizes for model complexity more than the AIC or Mallows' Cp statistics. - correct
answer ✔✔True
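For comparison, in one common parameterization (standard definitions, not from the source) the complexity penalty is 2p for AIC and p log n for BIC, so BIC penalizes more heavily whenever log n > 2, i.e. for n ≥ 8; Mallows' Cp uses a penalty comparable to AIC's.

    \mathrm{AIC} = -2\log L + 2p, \qquad \mathrm{BIC} = -2\log L + p\log n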
16. The penalty constant λ in penalized or regularized regression controls the trade-off between
lack of fit and model complexity. - correct answer ✔✔True
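A minimal sketch, not from the source, assuming the glmnet package: λ is typically chosen by cross-validation over a grid, with λ = 0 corresponding to ordinary least squares and large λ shrinking the coefficients toward zero.

    library(glmnet)
    set.seed(1)
    x <- matrix(rnorm(100 * 10), 100, 10)
    y <- x[, 1] - x[, 2] + rnorm(100)
    cvfit <- cv.glmnet(x, y, alpha = 1)
    cvfit$lambda.min                  # lambda minimizing cross-validated error
    coef(cvfit, s = "lambda.1se")     # sparser fit within one SE of the minimum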
17. We cannot perform variable selection based on the statistical significance of the regression
coefficients. - correct answer ✔✔True
18. Akaike Information Criterion is an estimate for the prediction risk. - correct answer ✔✔True
19. Forward stepwise regression is a greedy algorithm searching through all possible
combinations of the predicting variables. - correct answer ✔✔False
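A minimal sketch, not from the source, assuming the leaps package: forward stepwise adds one variable at a time (greedy), whereas an exhaustive search is what actually examines all possible subsets, which is what the statement describes.

    library(leaps)
    greedy     <- regsubsets(mpg ~ ., data = mtcars, nvmax = 10, method = "forward")
    exhaustive <- regsubsets(mpg ~ ., data = mtcars, nvmax = 10, method = "exhaustive")
    summary(greedy)$which        # variables added sequentially by the greedy search
    summary(exhaustive)$which    # best subset of each size over all combinations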