BUAL 2650 Final Exam - Lee Questions with Correct Answers
Terms in this set (54)
Adjusted R-squared: imposes a 'penalty' for each new term that's added to the model, in an attempt to make models of different sizes comparable.
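The penalty is visible in the formula itself. A minimal sketch (not from the course materials), assuming the standard adjusted R-squared formula with n observations and k predictors:

```python
# Adjusted R^2 = 1 - (1 - R^2) * (n - 1) / (n - k - 1)
def adjusted_r_squared(r_squared, n, k):
    """n = number of observations, k = number of predictors."""
    return 1 - (1 - r_squared) * (n - 1) / (n - k - 1)

# Same R^2, more predictors -> bigger penalty, lower adjusted R^2.
print(adjusted_r_squared(0.80, n=50, k=3))   # ~0.787
print(adjusted_r_squared(0.80, n=50, k=10))  # ~0.749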
Adjusted R-squared vs. R-squared: Adding new predictor variables will always keep the R-squared value the same or increase it. But even if the R-squared value grows, that does not mean that the resulting model is a better model or that it has greater predictive ability.
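A hedged, numpy-only illustration of this point (the data and names here are invented for the demo): adding a pure-noise predictor can only hold or raise plain R-squared, even though it adds no predictive value.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 2 * x + rng.normal(size=n)

def r_squared(X, y):
    X = np.column_stack([np.ones(len(y)), X])      # add an intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_one = r_squared(x[:, None], y)                  # one real predictor
r2_two = r_squared(np.column_stack([x, rng.normal(size=n)]), y)  # + noise
print(r2_two >= r2_one)  # True: R^2 never goes down when a term is added
```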
Autocorrelation: When values at time t are correlated with values at time t-1, we say the values are autocorrelated in the first order. If values are correlated with values two time periods back, we say second-order autocorrelation is present, and so on.
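As a quick sketch (made-up numbers), first-order autocorrelation is just the correlation between the series and itself shifted one period, second-order uses a two-period shift:

```python
import numpy as np

values = np.array([3.0, 3.4, 3.1, 3.8, 4.0, 4.3, 4.1, 4.6])
lag1 = np.corrcoef(values[:-1], values[1:])[0, 1]   # first-order
lag2 = np.corrcoef(values[:-2], values[2:])[0, 1]   # second-order
print(lag1, lag2)  # both high here because the series trends upward
```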
Autoregression and p-values: Large p-values (e.g., .870 and .699) mean that the values are not significant.
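For context, an autoregression regresses a series on its own lagged values, and the p-values in question sit on the lag coefficients. A hedged sketch using statsmodels (the series here is invented noise, so large p-values are expected):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
y = rng.normal(size=60)                    # pure noise: lags carry no signal
lags = np.column_stack([y[1:-1], y[:-2]])  # y at t-1 and t-2
fit = sm.OLS(y[2:], sm.add_constant(lags)).fit()
print(fit.pvalues)  # large p-values -> lag terms are not significant
```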
Causality Warning: No matter how strong the association, and no matter how large the R-squared value, there is no way to conclude from a regression alone that one variable caused the other.
Conditions for MRM:
1. Linearity condition
2. Randomization condition
3. No outliers
4. "Does the plot thicken?" condition (equal spread of residuals; see the sketch below)
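Conditions 1 and 4 are typically checked with a residuals-versus-fitted plot. A hedged sketch with simulated data (a real check would use the fitted model's actual residuals):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 80)
y = 1.5 * x + rng.normal(scale=2, size=80)
slope, intercept = np.polyfit(x, y, 1)
fitted = intercept + slope * x

plt.scatter(fitted, y - fitted)
plt.axhline(0, linestyle="--")
plt.xlabel("Fitted values")
plt.ylabel("Residuals")  # want no curve (linearity) and even spread
plt.show()
```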
Cyclical Components: Regular cycles in the data with periods longer than one year.
Dangers of Extrapolation:
- assumes there is a linear relationship beyond the range of the data
- assumes that nothing about the relationship between x and y changes at extreme values of x (illustrated below)
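A small made-up illustration of both dangers: a line fitted where the data live (x between 0 and 5) looks fine in-range but badly misses once the true, curved relationship is extrapolated to x = 10.

```python
import numpy as np

x = np.linspace(0, 5, 20)
y = x ** 2                               # the true relationship is curved
slope, intercept = np.polyfit(x, y, 1)   # linear fit looks okay in-range
print(intercept + slope * 10)            # extrapolated prediction, roughly 46
print(10 ** 2)                           # actual value: 100
```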
Dangers of R-squared
Dummy variable: assigning a value (typically 0 or 1) to indicate categorical variables.
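A minimal pandas sketch (made-up data): each category level becomes a 0/1 column, with one level dropped so the columns aren't redundant in a regression.

```python
import pandas as pd

df = pd.DataFrame({"region": ["North", "South", "South", "North"]})
dummies = pd.get_dummies(df["region"], drop_first=True, dtype=int)
print(dummies)  # one 'South' column of 0/1 values; 'North' is the baseline
```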
Durbin-Watson statistic:
- can detect first-order autocorrelation from the residuals of a regression analysis
- estimates the autocorrelation by summing the squared differences between successive residuals and comparing that sum with its expected value under the null hypothesis of no autocorrelation
- under 2: positive autocorrelation
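The statistic is simple to compute from the residuals. A hedged sketch, assuming the usual definition D = sum((e_t - e_{t-1})^2) / sum(e_t^2), where D near 2 means no first-order autocorrelation:

```python
import numpy as np

def durbin_watson(residuals):
    e = np.asarray(residuals, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Made-up residuals that drift slowly from positive to negative:
e = [1.0, 0.8, 0.5, 0.1, -0.3, -0.7, -0.9, -0.5]
print(durbin_watson(e))  # ~0.23, well under 2 -> positive autocorrelation
```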