1. All regularized regression approaches can be used for variable selection. - False
2. Penalization in linear regression models means penalizing complex models, that is, models with a
large number of predictors. - True
3. Elastic net regression uses both penalties of the ridge and lasso regression and hence combines the
benefits of both. - True
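To make statement 3 concrete, here is a minimal sketch of the combined elastic net penalty, using the common `alpha`/`l1_ratio` parameterization (the function name and parameter names here are my own choices, not from the source): setting `l1_ratio = 1` recovers the lasso penalty and `l1_ratio = 0` the ridge penalty.

```python
import numpy as np

def elastic_net_penalty(beta, alpha=1.0, l1_ratio=0.5):
    """Combined penalty: alpha * (l1_ratio * ||b||_1 + (1 - l1_ratio)/2 * ||b||_2^2).

    l1_ratio = 1 gives the lasso (L1) penalty; l1_ratio = 0 gives the ridge (L2) penalty.
    """
    beta = np.asarray(beta, dtype=float)
    l1 = np.abs(beta).sum()          # lasso part: sum of absolute values
    l2 = (beta ** 2).sum()           # ridge part: sum of squares
    return alpha * (l1_ratio * l1 + (1.0 - l1_ratio) / 2.0 * l2)

beta = [1.0, -2.0]
print(elastic_net_penalty(beta, l1_ratio=1.0))  # pure lasso: 3.0
print(elastic_net_penalty(beta, l1_ratio=0.0))  # pure ridge: 2.5
print(elastic_net_penalty(beta, l1_ratio=0.5))  # mixture:    2.75
```

The L1 part encourages exact zeros (variable selection), while the L2 part stabilizes the fit under correlated predictors, which is the sense in which the elastic net combines the benefits of both.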
4. Variable selection can be applied to regression problems when the number of predicting variables is
larger than the number of observations. - True
5. The lasso regression performs well under multicollinearity. - False
6. The variables selected using best subset regression are the best ones for explaining and predicting the
response variable. - False
8. The lasso regression requires a numerical algorithm to minimize the penalized sum of least squares. -
True
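Statement 8 holds because the L1 penalty is not differentiable at zero, so there is no closed-form solution and an iterative method is needed. A standard choice is coordinate descent with soft-thresholding; the sketch below is a minimal implementation (function and variable names are my own, and the `1/(2n)` scaling of the loss is one common convention, not the only one).

```python
import numpy as np

def soft_threshold(z, t):
    """Closed-form minimizer of the one-dimensional lasso subproblem."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iter=200):
    """Minimize (1/2n)||y - X b||^2 + lam * ||b||_1 by cycling over coordinates."""
    n, p = X.shape
    beta = np.zeros(p)
    col_norms = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove the j-th variable's current contribution.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            beta[j] = soft_threshold(rho, lam) / col_norms[j]
    return beta

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = 3.0 * X[:, 0] + rng.standard_normal(100) * 0.1
beta = lasso_coordinate_descent(X, y, lam=0.1)
print(np.round(beta, 2))  # only the first coefficient is clearly nonzero
```

Note how the soft-threshold sets small coefficients exactly to zero, which is also why the lasso performs variable selection (statement 1) while ridge regression does not.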
9. An unbiased estimator of the prediction risk is the training risk. - False
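Statement 9 is false because the same data are used both to fit and to evaluate the model, so the training risk is optimistically biased downward. A small simulation (sample sizes and seed are arbitrary choices of mine) illustrates the gap between training and test error for an OLS fit:

```python
import numpy as np

rng = np.random.default_rng(1)
n_train, n_test, p = 30, 1000, 10

beta_true = rng.standard_normal(p)
X_train = rng.standard_normal((n_train, p))
X_test = rng.standard_normal((n_test, p))
y_train = X_train @ beta_true + rng.standard_normal(n_train)  # noise sd = 1
y_test = X_test @ beta_true + rng.standard_normal(n_test)

# OLS fit on the training data only
beta_hat, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

train_mse = np.mean((y_train - X_train @ beta_hat) ** 2)
test_mse = np.mean((y_test - X_test @ beta_hat) ** 2)
print(train_mse < test_mse)  # training risk underestimates prediction risk
```

With noise variance 1, the expected training MSE is roughly 1 - p/n while the expected test MSE exceeds 1, which is why unbiased risk estimates require a correction (e.g. Mallows' Cp) or held-out data.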
10. Backward and forward stepwise regression will generally provide different sets of selected variables
when p, the number of predicting variables, is large. - True
11. If there are variables that must be included to control for selection bias in the model, they should
be forced into the model and not be part of the variable selection process. - True