WEEK 1.
Multiple regression analysis
Correlation:
Analyze —> correlate —> bivariate —> Pearson R.
Good predictor? —> Significant correlation.
Linear regression
Analyze —> regression —> linear
Put dependent variable
Put independent variables —> predictors
Statistics —> Part & Partial correlations & Collinearity diagnostics
Save —> Cook’s & Leverage values
Null hypothesis —> check ANOVA table
Report as F(df regression, df residual) = F statistic, p value
Variance explained by all predictors —> Look at model summary and R squared
Predictor explaining the most unique variance —> Coefficients table —> Correlations —> Part —>
take the biggest part correlation, but square it to get the variance!
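The "square the part correlation" rule can be checked outside SPSS: a predictor's squared part (semipartial) correlation equals the drop in R² when that predictor is removed from the model. A minimal sketch with made-up data (all variable names hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)           # correlated predictors
y = 1.0 + 2.0 * x1 + 1.5 * x2 + rng.normal(size=n)

def r_squared(y, predictors):
    """R^2 of an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r2_full = r_squared(y, [x1, x2])
r2_without_x2 = r_squared(y, [x1])
# Squared part correlation of x2 = unique variance it explains:
unique_x2 = r2_full - r2_without_x2
print(f"R^2 full = {r2_full:.3f}, unique share of x2 = {unique_x2:.3f}")
```

The same number comes out of SPSS by squaring the value in the Part column of the Coefficients table.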
Multicollinearity —> dependency between predictors, we want independent variables.
VIF needs to be smaller than 10.
Tolerance needs to be larger than .10 (tolerance < .10 signals multicollinearity).
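VIF and tolerance are two views of the same thing (VIF = 1 / tolerance), so the two rules of thumb agree: VIF < 10 is the same cutoff as tolerance > .10. A small sketch using the thresholds from these notes:

```python
# VIF_j = 1 / (1 - R_j^2) = 1 / tolerance_j,
# where R_j^2 comes from regressing predictor j on all other predictors.
def check_collinearity(tolerance):
    """Apply the rules of thumb: VIF < 10 and tolerance > .10."""
    vif = 1.0 / tolerance
    return {"vif": vif, "ok": vif < 10 and tolerance > 0.10}

print(check_collinearity(0.50))   # VIF = 2.0  -> fine
print(check_collinearity(0.05))   # VIF = 20.0 -> multicollinearity warning
```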
Cook’s distances and leverage values indicate the presence of outliers.
Leverage value: 3(k (number of predictors)+1)/n. You do not want to exceed this value.
Cook’s distance: > 1. Influential outlier.
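Both cutoffs can be applied by hand to the saved COO_1 and LEV_1 columns. A sketch with hypothetical values (five cases, two predictors):

```python
def leverage_cutoff(k, n):
    """Rule of thumb from the notes: 3 * (k + 1) / n,
    with k predictors and n cases."""
    return 3 * (k + 1) / n

def flag_cases(leverages, cooks, k, n):
    """Indices of cases exceeding either cutoff (leverage or Cook's > 1)."""
    cut = leverage_cutoff(k, n)
    return [i for i, (h, d) in enumerate(zip(leverages, cooks))
            if h > cut or d > 1]

# Hypothetical saved values (LEV_1, COO_1) for 5 cases:
lev = [0.02, 0.05, 0.30, 0.04, 0.03]
coo = [0.10, 0.02, 0.90, 1.40, 0.05]
print(leverage_cutoff(2, 100))            # 0.09
print(flag_cases(lev, coo, k=2, n=100))   # [2, 3]
```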
To see this particular outlier, look in the data, click on the COO_1 column —> right mouse click —>
Sort Descending.
Do not remove them! Instead click on Data —> Select Cases —> If condition is satisfied.
Make a scatterplot —> ZPRED —> x axis
ZRESID —> y axis
Check the normal probability plot.
Regression equation —> ŷ = b0 (constant) + b1·x1 + b2·x2 + … (predictor weights can be + or −).
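Filling in the equation with hypothetical coefficients taken from the Coefficients table:

```python
# Hypothetical unstandardized B weights from the SPSS Coefficients table:
b0 = 4.2                            # constant
coefs = {"x1": 1.3, "x2": -0.7}     # predictor weights

def predict(case):
    """y-hat = b0 + b1*x1 + b2*x2 + ..."""
    return b0 + sum(b * case[name] for name, b in coefs.items())

print(predict({"x1": 2.0, "x2": 1.0}))   # 4.2 + 2.6 - 0.7 = 6.1
```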
WEEK 2.
ANOVA
Cross tab —> Analyze —> Descriptive Statistics —> Crosstabs
Put variables that are asked in rows and columns, order does not matter.
F robustness
Normality —> all group sizes > 15.
Homogeneity —> max group size / min group size < 1.5.
Perform ANOVA —> Analyze —> General Linear Model —> Univariate
Fixed factors —> independent variables
Request a profile plot of strategy and gender —> put one variable on the horizontal axis and the
other in separate lines. It does not matter which one goes where, unless specifically asked. Click on Add.
In options ask for descriptives and homogeneity test.
Save —> standardized residuals —> makes a new variable (check them by making a
histogram in Chart Builder, or a Q-Q plot: Analyze —> Descriptive Statistics —> Q-Q Plots).
Estimated marginal means —> EM Means —> main effects are the single variables and the
interaction is both together.
Homogeneity —> Levene’s test —> based on mean (you want it to be not significant,
because you want equal variances). Rule of thumb for group standard deviations —> in the
descriptives table, largest standard deviation / smallest standard deviation < 2.
Eta squared: SS variable x1 / total variance (corrected total)
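Worked example of the eta squared formula, with hypothetical sums of squares from the ANOVA table:

```python
# Hypothetical values from an SPSS ANOVA (Tests of Between-Subjects Effects) table:
ss_effect = 120.0           # SS for the factor of interest
ss_corrected_total = 800.0  # "Corrected Total" row

eta_squared = ss_effect / ss_corrected_total
print(eta_squared)   # 0.15 -> the factor explains 15% of total variance
```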
WEEK 4
LRA (logistic regression analysis)
To check the collinearity assumption, first carry out a linear regression.
Analyze —> regression —> linear
Click in Statistics —> collinearity diagnostics.
Make sure all VIF values are under 10, so there is no evidence for multicollinearity among the
predictors.
Perform LRA —> Analyze —> Regression —> Binary Logistic.
In Options, ask for the iteration history to get the constant-only model (predicting Y based on
chance).
Block 0 —> constant only model. (by chance)
Block 1 —> with predictors
Rejecting H0 —> Omnibus Tests of Model Coefficients
What is the −2 Log likelihood of the constant only model (model with no predictors)?
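The constant-only model predicts the overall proportion of 1s for every case, so its −2 log-likelihood follows directly from that proportion: LL = n1·ln(p) + n0·ln(1 − p) with p = n1/n. A sketch (sample counts hypothetical):

```python
import math

def minus_2ll_constant_only(n1, n0):
    """-2 log-likelihood of the intercept-only logistic model,
    which predicts p = n1 / (n1 + n0) for every case."""
    p = n1 / (n1 + n0)
    ll = n1 * math.log(p) + n0 * math.log(1 - p)
    return -2 * ll

# Hypothetical sample: 60 "yes" and 40 "no" cases:
print(round(minus_2ll_constant_only(60, 40), 2))   # 134.6
```

This should match the −2 Log likelihood reported for Block 0 in the iteration history.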