Chapter 13: Nonlinear and Multiple Regression
COMPLETION
1. Multiple regression analysis involves building models for relating a dependent variable y to ____ or more
independent variables.
ANS: two
PTS: 1
2. Many statisticians recommend ____ for an assessment of model validity and usefulness. These include plotting
the residuals or standardized residuals on the vertical axis versus the independent variable or the fitted values
on the horizontal axis.
ANS: residual plots
PTS: 1
3. If the regression parameters $\beta_0$ and $\beta_1$ are estimated by minimizing the expression
$\sum w_i [y_i - (b_0 + b_1 x_i)]^2$, where the $w_i$'s are weights that decrease with increasing $x_i$, this
yields ____ estimates.
ANS: weighted least squares
PTS: 1
4. The principle of minimizing the sum of absolute deviations selects $b_0$ and $b_1$ to minimize $\sum |y_i - (b_0 + b_1 x_i)|$. This is the ____ principle.
ANS: MAD (minimize absolute deviations)
PTS: 1
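As a minimal sketch (hypothetical data and a hypothetical candidate line), the two fitting criteria from Questions 3 and 4 can be evaluated side by side:

```python
# Sketch (hypothetical data): compare the least-squares criterion with the
# sum-of-absolute-deviations (MAD) criterion for a candidate line b0 + b1*x.

def sum_sq(b0, b1, xs, ys):
    # least-squares criterion: sum of (y_i - (b0 + b1*x_i))^2
    return sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(xs, ys))

def sum_abs(b0, b1, xs, ys):
    # MAD criterion: sum of |y_i - (b0 + b1*x_i)|
    return sum(abs(y - (b0 + b1 * x)) for x, y in zip(xs, ys))

xs = [1, 2, 3, 4]
ys = [2.1, 3.9, 6.2, 7.8]

print(sum_sq(0, 2, xs, ys))   # squared-error criterion for the line y = 2x
print(sum_abs(0, 2, xs, ys))  # absolute-error criterion for the same line
```

Least squares minimizes the first quantity; the MAD principle minimizes the second, which is less sensitive to outlying observations.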
5. The transformation ____ is used to linearize the function ____.
ANS:
PTS: 1
6. A function relating y to x is ____ if, by means of a transformation on x and/or y, the function can be expressed
as $y' = \beta_0 + \beta_1 x'$, where $x'$ is the transformed independent variable and $y'$ is the transformed dependent variable.
ANS: intrinsically linear
PTS: 1
7. For the exponential function $y = \alpha e^{\beta x} \cdot \varepsilon$, only the ____ variable is transformed via the transformation ____
to achieve linearity.
ANS: dependent, $y' = \ln(y)$
PTS: 1
8. The transformation ____ of the dependent variable y and the transformation ____ of the independent
variable x are used to linearize the power function $y = \alpha x^{\beta}$.
ANS: $y' = \log(y)$, $x' = \log(x)$
PTS: 1
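A minimal sketch (hypothetical, noise-free data) of the power-function linearization in Question 8: log-transform both variables, fit an ordinary least-squares line on the transformed scale, and recover $\alpha$ and $\beta$:

```python
import math

# Sketch (hypothetical data): linearize the power function y = alpha * x**beta
# via the log transformations y' = ln(y), x' = ln(x), then fit a straight line.
alpha, beta = 2.0, 1.5
xs = [1.0, 2.0, 4.0, 8.0]
ys = [alpha * x ** beta for x in xs]  # exact power-law data, no noise

xp = [math.log(x) for x in xs]   # transformed predictor x' = ln(x)
yp = [math.log(y) for y in ys]   # transformed response  y' = ln(y)

# ordinary least-squares slope and intercept on the transformed scale
n = len(xp)
xbar, ybar = sum(xp) / n, sum(yp) / n
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xp, yp)) / \
     sum((x - xbar) ** 2 for x in xp)
b0 = ybar - b1 * xbar

print(b1)            # slope recovers beta
print(math.exp(b0))  # exp(intercept) recovers alpha
```

Because the data are generated exactly from the power law, the fitted slope equals $\beta$ and the exponentiated intercept equals $\alpha$.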
9. The transformation ____ is used to linearize the reciprocal function $y = \alpha + \beta \cdot \frac{1}{x}$.
ANS: $x' = 1/x$
PTS: 1
10. The additive exponential and power models, $Y = \alpha e^{\beta x} + \varepsilon$ and $Y = \alpha x^{\beta} + \varepsilon$, are ____ linear.
ANS: not intrinsically
PTS: 1
11. The function $\pi(x) = \dfrac{e^{\beta_0 + \beta_1 x}}{1 + e^{\beta_0 + \beta_1 x}}$ has been found quite useful in many applications. This function is
well known as the ____ function.
ANS: logit
PTS: 1
12. In logistic regression it can be shown that $\dfrac{\pi(x)}{1 - \pi(x)} = e^{\beta_0 + \beta_1 x}$. The expression on the left-hand side of this equality is
well known as the ____.
ANS: odds ratio
PTS: 1
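A minimal sketch (hypothetical coefficients) verifying numerically that the odds, $\pi(x)/(1 - \pi(x))$, equal $e^{\beta_0 + \beta_1 x}$ when $\pi(x)$ is the logit function:

```python
import math

# Sketch (hypothetical coefficients b0, b1): the logit function and the
# odds identity pi(x) / (1 - pi(x)) = exp(b0 + b1*x).
b0, b1 = -1.0, 0.5

def pi(x):
    # logit function: e^(b0 + b1*x) / (1 + e^(b0 + b1*x))
    z = math.exp(b0 + b1 * x)
    return z / (1 + z)

x = 3.0
odds = pi(x) / (1 - pi(x))
print(odds)                     # the odds at x
print(math.exp(b0 + b1 * x))    # identical, by the odds identity
```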
13. The kth-degree polynomial regression model equation is $Y = \beta_0 + \beta_1 x + \beta_2 x^2 + \cdots + \beta_k x^k + \varepsilon$, where $\varepsilon$ is a normally
distributed random variable with $E(\varepsilon) =$ ____ and $V(\varepsilon) =$ ____.
ANS: 0, $\sigma^2$
PTS: 1
14. With $\hat{y}_i$ denoting the ith fitted value, the sum of squared residuals (error sum of squares) is
SSE $= \sum (y_i - \hat{y}_i)^2$. Hence the mean square error is MSE = ____ / ____.
ANS: SSE, n − (k + 1)
PTS: 1
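A minimal sketch (hypothetical observed and fitted values) of the SSE and MSE computations in Question 14:

```python
# Sketch (hypothetical fit): SSE = sum of squared residuals, and
# MSE = SSE / (n - (k + 1)) for a model with k predictors.
ys    = [3.0, 5.0, 7.0, 6.0, 9.0]   # observed values
yhats = [2.8, 5.4, 6.5, 6.3, 9.0]   # fitted values from some k-predictor model
k = 1                                # e.g. a single predictor

sse = sum((y - yh) ** 2 for y, yh in zip(ys, yhats))
mse = sse / (len(ys) - (k + 1))
print(sse, mse)
```

The divisor n − (k + 1) is the error degrees of freedom: n observations minus k + 1 estimated parameters.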
15. If we let SST $= \sum (y_i - \bar{y})^2$ and SSE $= \sum (y_i - \hat{y}_i)^2$, then SSE/SST is the proportion of the total variation in the
observed $y_i$'s that is ____ by the polynomial model.
ANS: not explained
PTS: 1
16. If we let SST $= \sum (y_i - \bar{y})^2$ and SSE $= \sum (y_i - \hat{y}_i)^2$, then 1 − SSE/SST is the proportion of the total variation in the
observed $y_i$'s that is ____ by the polynomial model. It is called the ____, and is denoted by $R^2$.
ANS: explained, coefficient of multiple determination
PTS: 1
17. In general, with SSE$_k$ the error sum of squares from a kth-degree polynomial, SSE$_k$ ____ SSE$_{k'}$ and $R^2_k$ ____ $R^2_{k'}$
whenever $k'$ > k.
ANS: $\geq$, $\leq$
PTS: 1
18. If $R^2$ = .75 is the value of the coefficient of multiple determination from a cubic regression model and n = 15,
then the adjusted $R^2$ value is ____.
ANS: .68
PTS: 1
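The .68 in Question 18 can be reproduced with the adjusted coefficient of multiple determination, adjusted $R^2 = 1 - \frac{n-1}{n-(k+1)}(1 - R^2)$:

```python
# Adjusted R^2 = 1 - ((n - 1) / (n - (k + 1))) * (1 - R^2),
# which penalizes R^2 for the number of estimated parameters.
def adjusted_r2(r2, n, k):
    # r2: ordinary R^2, n: sample size, k: number of predictors
    return 1 - (n - 1) / (n - (k + 1)) * (1 - r2)

# cubic model => k = 3 predictors (x, x^2, x^3); n = 15; R^2 = .75
print(round(adjusted_r2(0.75, 15, 3), 2))  # 0.68
```

Here 1 − (14/11)(.25) ≈ .6818, which rounds to the stated answer of .68.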
19. The regression coefficient $\beta_i$ in the multiple regression model $Y = \beta_0 + \beta_1 x_1 + \cdots + \beta_k x_k + \varepsilon$ is interpreted
as the expected change in ____ associated with a 1-unit increase in ____, while ____ are held
fixed.
ANS: Y, $x_i$, the remaining predictors $x_1, \dots, x_{i-1}, x_{i+1}, \dots, x_k$
PTS: 1
20. A dichotomous variable, one with just two possible categories, can be incorporated into a regression model via a
____ or ____ variable x whose possible values 0 and 1 indicate which category is relevant for any
particular observation.
ANS: dummy, indicator
PTS: 1
21. Incorporating a categorical variable with 5 possible categories into a multiple regression model requires the use of
____ dummy variables.
ANS: 4
PTS: 1
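A minimal sketch (hypothetical category labels) of why a categorical predictor with c categories needs only c − 1 dummy variables, as in Question 21:

```python
# Sketch: a categorical predictor with c categories needs c - 1 dummy
# (indicator) variables; the omitted category serves as the reference level.
def dummy_code(value, categories):
    # returns c - 1 indicators, one per non-reference category
    return [1 if value == cat else 0 for cat in categories[1:]]

brands = ["A", "B", "C", "D", "E"]        # 5 categories => 4 dummies
print(dummy_code("C", brands))            # [0, 1, 0, 0]
print(dummy_code("A", brands))            # [0, 0, 0, 0]: the reference level
```

The reference category is encoded by all indicators being 0, so a fifth dummy would be redundant.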
22. Inferences concerning a single parameter in a multiple regression model with 5 predictors and 25 observations are
based on a standardized variable T which has a t distribution with ____ degrees of freedom.
ANS: 19
PTS: 1
23. If a data set on at least five predictors is available, regressions involving all possible subsets of the predictors involve
at least ____ different models.
ANS: 32
PTS: 1
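The count in Question 23 follows because each predictor is either included or excluded, giving $2^p$ subset models for p predictors. A minimal enumeration sketch:

```python
# Sketch: with p candidate predictors, each predictor is either in or out of
# the model, so there are 2**p possible subset models (including the
# intercept-only model with no predictors).
from itertools import combinations

predictors = ["x1", "x2", "x3", "x4", "x5"]
subsets = [c for r in range(len(predictors) + 1)
           for c in combinations(predictors, r)]
print(len(subsets))  # 32 = 2**5
```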
24. A multiple regression model with k predictors will include ____ regression parameters, because the intercept $\beta_0$ will
always be included.
ANS: k + 1
PTS: 1