ISYE 6414 Midterm Prep Questions With Correct Answers!!
We can assess the constant variance assumption in linear regression by plotting the residuals vs. fitted values. - True (see the residual-plot sketch after this list)
If one confidence interval in the pairwise comparison in ANOVA includes zero, we conclude that the two corresponding means are plausibly equal. - True
The assumption of normality is not required in linear regression to make inference on the regression coefficients. - False (Explanation: it is required)
We cannot estimate a multiple linear regression model if the predicting variables are linearly independent. - False (Explanation: we cannot estimate the model if they are linearly dependent)
If a predicting variable is a categorical variable with 5 categories in a linear regression model without intercept, we will include 5 dummy variables. - True (see the dummy-variable sketch after this list)
If the normality assumption does not hold for a regression, we may use a transformation on the response variable. - True
The prediction of the response variable has higher uncertainty than the estimation of the mean response. - True (see the interval-comparison sketch after this list)
Statistical inference for linear regression under normality relies on large sample size. - False (Explanation: under the normality assumption, exact inference is valid even for small samples)
A nonlinear relationship between the response variable and a predicting variable cannot be modeled using regression. - False (Explanation: nonlinear relationships can often be modeled using linear regression by including polynomial terms of the predicting variable, for example)
Assumption of normality in linear regression is required for confidence intervals, prediction intervals, and hypothesis testing. - True
If the confidence interval for a regression coefficient contains the value zero, we interpret that the regression coefficient is plausibly equal to zero. - True
The smaller the coefficient of determination or R-squared, the higher the variability explained by the simple linear regression. - False (Explanation: the larger the R-squared, the higher the variability explained)
The estimators of the variance parameter and of the regression coefficients in a regression model are random variables. - True
The standard error in linear regression indicates how far the data points are from the regression line, on average. - True
A linear regression model is a good fit to the data set if the R-squared is above 0.90. - False (Explanation: there are other things to check: assumptions, MSE, etc.)
In ANOVA, we assume the variance of the response variable is different for each population. - False (Explanation: the variance is assumed to be the same across all populations)
The F-test in ANOVA compares the between variability versus the within variability. - True
In testing for subsets of coefficients in a multiple linear regression, the null hypothesis we test for is that all coefficients are equal; H_0: β_1 = β_2 = ... = β_k. - False (Explanation: the null hypothesis is that all coefficients in the subset are equal to zero; none are significant in predicting the response. See the partial F-test sketch after this list.)
The only assumptions for a simple linear regression model are linearity, constant variance, and normality. - False (Explanation: independence of the errors is also assumed)
In a simple linear regression model, the variable of interest is the response variable. - True
The constant variance assumption is diagnosed by plotting the predicting variable vs. the response variable. - False (Explanation: it is diagnosed by plotting the residuals vs. the fitted values)
β̂_1 is an unbiased estimator for β_0. - False
The estimator σ̂² is a fixed variable. - False (Explanation: it is a random variable)
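Residual-plot sketch. A minimal Python example, using simulated data and illustrative variable names (assumptions, not taken from the questions), of how the constant variance and normality assumptions are checked with a residuals-vs-fitted plot and a normal Q-Q plot:

```python
import numpy as np
import matplotlib.pyplot as plt
import statsmodels.api as sm

# Simulated data with a linear mean and constant error variance (illustrative).
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 2.0 + 1.5 * x + rng.normal(0, 1, 100)

X = sm.add_constant(x)              # add the intercept column
model = sm.OLS(y, X).fit()

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
# Residuals vs fitted values: look for a random scatter with constant spread.
axes[0].scatter(model.fittedvalues, model.resid)
axes[0].axhline(0, linestyle="--")
axes[0].set_xlabel("Fitted values")
axes[0].set_ylabel("Residuals")
# Normal Q-Q plot of the residuals: points close to the line support normality.
sm.qqplot(model.resid, line="45", fit=True, ax=axes[1])
plt.tight_layout()
plt.show()
```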
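Dummy-variable sketch. A minimal example, again with simulated data and made-up names, of how a categorical predictor with 5 categories enters the model: 4 dummy variables when an intercept is included, 5 when the intercept is dropped:

```python
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols

# Simulated response with a different mean for each of 5 groups (illustrative).
rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({"group": rng.choice(list("ABCDE"), size=n)})
df["y"] = df["group"].map({"A": 0, "B": 1, "C": 2, "D": 3, "E": 4}) + rng.normal(size=n)

with_intercept = ols("y ~ C(group)", data=df).fit()         # intercept + 4 dummies
without_intercept = ols("y ~ C(group) - 1", data=df).fit()  # 5 dummies, no intercept

print(with_intercept.params)     # Intercept, C(group)[T.B], ..., C(group)[T.E]
print(without_intercept.params)  # C(group)[A], ..., C(group)[E]
```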
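Interval-comparison sketch. A minimal example (simulated data, illustrative names) contrasting the confidence interval for the mean response with the prediction interval for a new observation at the same point; the prediction interval is always the wider of the two, reflecting the higher uncertainty of predicting an individual response:

```python
import numpy as np
import statsmodels.api as sm

# Simulated simple linear regression data (illustrative).
rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 50)
y = 3.0 + 0.8 * x + rng.normal(0, 2, 50)

fit = sm.OLS(y, sm.add_constant(x)).fit()

x_new = np.array([[1.0, 5.0]])   # [intercept, x] for a new point at x = 5
pred = fit.get_prediction(x_new).summary_frame(alpha=0.05)

print(pred[["mean", "mean_ci_lower", "mean_ci_upper"]])  # CI for the mean response
print(pred[["obs_ci_lower", "obs_ci_upper"]])            # wider prediction interval
```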
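Partial F-test sketch. A minimal example (simulated data, illustrative names) of testing a subset of coefficients by comparing a reduced model against the full model; the null hypothesis is that every coefficient in the subset equals zero, e.g. H_0: β_2 = β_3 = 0 here:

```python
import numpy as np
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

# Simulated data where x1 matters and x2, x3 contribute little (illustrative).
rng = np.random.default_rng(3)
n = 100
df = pd.DataFrame({
    "x1": rng.normal(size=n),
    "x2": rng.normal(size=n),
    "x3": rng.normal(size=n),
})
df["y"] = 1.0 + 2.0 * df["x1"] + 0.3 * df["x2"] + rng.normal(size=n)

reduced = ols("y ~ x1", data=df).fit()             # model without x2, x3
full = ols("y ~ x1 + x2 + x3", data=df).fit()      # full model

# Partial F-test comparing the nested models.
print(anova_lm(reduced, full))
```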