Econometrics Final Exam Questions and Answers
100% Correct
In the equation, y=β_0+β_1 x_1+β_2 x_2+u, β_2 is a(n) _____.
a. independent variable
b. dependent variable
c. slope parameter
d. intercept parameter
c. slope parameter
Consider the following regression equation: y=β_0+β_1 x_1+β_2 x_2+u. What does β_1
measure?
a. β_1 measures the ceteris paribus effect of x_1 on x_2.
b. β_1 measures the ceteris paribus effect of y on x_1.
c. β_1 measures the ceteris paribus effect of x_1 on y.
d. β_1 measures the ceteris paribus effect of x_1 on u.
c. β_1 measures the ceteris paribus effect of x_1 on y.
If the explained sum of squares is 35 and the total sum of squares is 49, what is the residual
sum of squares?
a. 10
b. 12
c. 18
d. 14
d. 14
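The decomposition behind this answer is TSS = ESS + RSS, so the residual sum of squares is just the total minus the explained part. A minimal check in Python, using the figures from the question:

```python
# Sum-of-squares decomposition: TSS = ESS + RSS, so RSS = TSS - ESS.
ess = 35.0  # explained sum of squares (given in the question)
tss = 49.0  # total sum of squares (given in the question)
rss = tss - ess
print(rss)        # 14.0
print(ess / tss)  # R-squared for this regression, about 0.714
```

The same decomposition gives R-squared = ESS/TSS, which connects to the next two questions.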
Which of the following is true of R^2?
a. R^2 is also called the standard error of regression.
b. A low R^2 indicates that the Ordinary Least Squares line fits the data well.
c. R^2 usually decreases with an increase in the number of independent variables in a
regression.
d. R^2 shows what percentage of the total variation in the dependent variable, Y, is
explained by the explanatory variables.
d. R^2 shows what percentage of the total variation in the dependent variable, Y, is explained by
the explanatory variables.
The value of R^2 always _____.
a. lies below 0
b. lies above 1
c. lies between 0 and 1
d. lies between 1 and 1.5
c. lies between 0 and 1
If an independent variable in a multiple linear regression model is an exact linear
combination of other independent variables, the model suffers from the problem of _____.
a. perfect collinearity
b. homoskedasticity
c. heteroskedasticity
d. omitted variable bias
a. perfect collinearity
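Under perfect collinearity the design matrix loses full column rank, so the OLS normal equations have no unique solution. A small NumPy sketch (variable names and data are illustrative, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 2 * x1 + x2  # x3 is an exact linear combination of x1 and x2

# Design matrix with an intercept column: 4 columns, but only rank 3,
# so X'X is singular and the OLS estimator is not uniquely defined.
X = np.column_stack([np.ones(n), x1, x2, x3])
print(np.linalg.matrix_rank(X))  # 3, not 4
```

Dropping x3 (or either of x1, x2) restores full rank and a unique OLS solution.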
The assumption that there are no exact linear relationships among the independent
variables in a multiple linear regression model fails if _____, where n is the sample size and
k is the number of parameters.
a. n>2
b. n=k+1
c. n>k
d. n<k+1
d. n<k+1
Exclusion of a relevant variable from a multiple linear regression model leads to the
problem of _____.
a. misspecification of the model
b. multicollinearity
c. perfect collinearity
d. homoskedasticity
a. misspecification of the model
Suppose the variable x_2 has been omitted from the following regression equation:
y=β_0+β_1 x_1+β_2 x_2+u. β̃_1 is the estimator obtained when x_2 is omitted from the
equation. The bias in β̃_1 is positive if _____.
a. β_2 > 0 and x_1 and x_2 are positively correlated
b. β_2 < 0 and x_1 and x_2 are positively correlated
c. β_2 > 0 and x_1 and x_2 are negatively correlated
d. β_2 = 0 and x_1 and x_2 are negatively correlated
a. β_2 > 0 and x_1 and x_2 are positively correlated
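The sign result comes from the omitted-variable bias formula: the short-regression slope converges to β_1 + β_2·δ_1, where δ_1 is the slope from regressing x_2 on x_1, so the bias is positive when β_2 and the x_1–x_2 correlation share the same sign. A simulation sketch with assumed parameter values (β_1 = 1, β_2 = 0.5, correlation slope 0.8):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
beta1, beta2 = 1.0, 0.5               # beta2 > 0 (chosen for illustration)
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)    # x1 and x2 positively correlated
u = rng.normal(size=n)
y = 2.0 + beta1 * x1 + beta2 * x2 + u

# Short regression of y on x1 alone: slope estimate beta1_tilde.
beta1_tilde = np.cov(x1, y, bias=True)[0, 1] / np.var(x1)
print(beta1_tilde)  # close to beta1 + beta2 * 0.8 = 1.4, above the true beta1 = 1
```

With β_2 > 0 and a positive x_1–x_2 relationship, β̃_1 overstates β_1, matching answer a.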