
Summary Research Methods - Data Analysis I

Pages: 2
Uploaded on: 21-01-2021
Written in: 2020/2021

Summary of 2 pages for the course Research Methods In Psychology at UT


Content preview

RM | Unit 140 - Multiple Linear Regression


Book: Analysing Data Using Linear Models
Chapter 4: 4.15, 4.16, 4.17


Chapter 4.15: Multiple regression in R
Check the book on how to do this in R.
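The book shows the actual commands with R's lm() function. As a language-agnostic sketch of what such a fit computes, the snippet below solves the normal equations and derives R2 from the residuals in pure Python; the data and variable names are hypothetical stand-ins for the book's volume/area/weight example, not its actual numbers.

```python
# A minimal multiple-regression fit in pure Python (a sketch, not the
# book's R code; the data below is made up for illustration).

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_ols(X, y):
    """Least-squares coefficients from the normal equations X'X b = X'y."""
    p = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(p)] for i in range(p)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(p)]
    return solve(XtX, Xty)

def r_squared(X, y, b):
    """Proportion of explained variance: 1 - SS_res / SS_tot."""
    yhat = [sum(bj * xj for bj, xj in zip(b, row)) for row in X]
    ybar = sum(y) / len(y)
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Hypothetical books: weight is constructed as 50 + 30*volume + 0.5*area,
# so the fit should recover those coefficients (up to rounding).
volume = [10, 12, 8, 15, 11, 9, 14, 13]
area = [300, 450, 250, 500, 320, 280, 480, 410]
weight = [50 + 30 * v + 0.5 * a for v, a in zip(volume, area)]

X = [[1, v, a] for v, a in zip(volume, area)]  # leading 1 = intercept b0
b = fit_ols(X, weight)
r2 = r_squared(X, weight, b)
print([round(bi, 4) for bi in b])  # ≈ [50.0, 30.0, 0.5]
print(round(r2, 4))                # ≈ 1.0 for this noise-free data
```

In practice you would let R (or any statistics package) do this; the point of the sketch is only that "running a multiple regression" means solving for the b's that minimise the squared residuals and then comparing residual variance to total variance.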


Chapter 4.16: Multicollinearity
In general, if you add independent variables to a regression equation, the proportion of explained variance, R2, increases. Suppose you have the following three regression equations:

weight = b0 + b1 × volume + e (4.39)
weight = b0 + b1 × area + e (4.40)
weight = b0 + b1 × volume + b2 × area + e (4.41)

→ If we carry out these three analyses, we obtain an R2 of 0.8026346 if we only use volume as a predictor, and an R2 of 0.1268163 if we only use area as a predictor. So perhaps you would expect that if we take both volume and area as predictors in the model, we would get an R2 of 0.8026346 + 0.1268163 = 0.9294509. However, if we carry out the multiple regression with volume and area, we obtain an R2 of 0.9284738, which is slightly less! This is not a rounding error but results from the fact that there is a correlation between the volume of a book and the area of a book. Here it is a tiny correlation of 0.002, but it nevertheless affects the proportion of variance explained when you use both variables.
→ Let's look at what happens when independent variables are strongly correlated (see the figure in the book).
→ When two predictor variables are perfectly correlated (a correlation of exactly 1 or -1), regression is no longer possible: the software stops and you get a warning. We call such a situation multicollinearity. But even if the correlation is only close to 1 or -1, you should be very careful when interpreting the regression parameters. If this happens, try to find out which variables are highly correlated, and select the variable that makes the most sense.
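Why the combined R2 falls short of the sum can be made concrete for the two-predictor case: a standard result (not stated in the book excerpt above) is that R2 = (ry1² + ry2² − 2·ry1·ry2·r12) / (1 − r12²), where ry1 and ry2 are each predictor's correlation with the outcome and r12 is the correlation between the predictors. The pure-Python sketch below uses made-up numbers (not the book's data) with two strongly correlated predictors to show that simply adding the two single-predictor R2's overshoots badly:

```python
# Illustration (hypothetical numbers, not the book's data) of how
# correlated predictors make R2 non-additive.  For two predictors:
#   R2 = (ry1^2 + ry2^2 - 2*ry1*ry2*r12) / (1 - r12^2)

from math import sqrt

def pearson(a, b):
    """Pearson correlation between two equally long lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den

x1 = [1, 2, 3, 4, 5, 6]              # e.g. volume
x2 = [1.1, 2.0, 2.9, 4.2, 4.8, 6.1]  # e.g. area, strongly correlated with x1
y = [2, 4, 6, 8, 10, 12]             # e.g. weight (here exactly 2 * x1)

ry1, ry2, r12 = pearson(y, x1), pearson(y, x2), pearson(x1, x2)
r2_combined = (ry1**2 + ry2**2 - 2 * ry1 * ry2 * r12) / (1 - r12**2)
r2_sum = ry1**2 + ry2**2  # the naive "just add the two R2's" guess

print(round(r12, 3))          # the predictors are almost perfectly correlated
print(round(r2_combined, 3))  # R2 of the combined model
print(round(r2_sum, 3))       # the naive sum overshoots -- it even exceeds 1
```

Note the denominator 1 − r12²: when r12 is exactly 1 or −1 it becomes zero, which is precisely why software refuses to fit a model with perfectly correlated predictors.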