This lesson explains how to use matrix methods to transform raw scores into deviation scores. We show the transformation for vectors and for matrices.
Deviation Scores: Vectors
A deviation score is the difference between a raw score and the mean.
\[ d_i = x_i - \bar{x} \]
where
d_i is the deviation score for the ith observation in a set of observations
x_i is the raw score for the ith observation in a set of observations
\bar{x} is the mean of all the observations in the set
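As a quick illustration of the definition (this snippet is ours, not part of the lesson), deviation scores for a small set of raw scores can be computed directly in Python; the values in raw_scores are assumed example data:

    raw_scores = [1, 2, 3]                         # a small set of raw scores (assumed example values)
    mean = sum(raw_scores) / len(raw_scores)       # the mean of all observations
    deviations = [x - mean for x in raw_scores]    # d_i = x_i - mean
    print(deviations)                              # [-1.0, 0.0, 1.0]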
Often, it is easier to work with deviation scores than with raw scores. Use the following formula to transform a vector of n raw scores
into a vector of n deviation scores.
\[ d = x - 1'x \, 1 \, (1'1)^{-1} = x - 1'x \, 1 \, (1/n) \]
where
1 is an n x 1 column vector of ones
d is an n x 1 column vector of deviation scores: d1, d2, . . . , dn
x is an n x 1 column vector of raw scores: x1, x2, . . . , xn
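The vector formula can be carried out literally with matrix operations. Below is a minimal NumPy sketch; the function name deviation_scores and the array shapes are our own choices, not part of the lesson. Since 1'x (1'1)^-1 is a 1 x 1 quantity (the mean), the code multiplies it on the right of the ones vector, which is the conformable arrangement of the same formula.

    import numpy as np

    def deviation_scores(x):
        # x is an n x 1 column vector of raw scores (NumPy array of shape (n, 1)).
        n = x.shape[0]
        ones = np.ones((n, 1))                              # 1: an n x 1 column vector of ones
        xbar = (ones.T @ x) @ np.linalg.inv(ones.T @ ones)  # 1'x (1'1)^-1, a 1 x 1 matrix holding the mean
        return x - ones @ xbar                              # d = x - 1'x 1 (1'1)^-1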
To show how this works, let's transform the raw scores in vector x to deviation scores in vector d. For this example, let x' = [ 1 2 3 ].
\[ d = x - 1'x \, 1 \, (1'1)^{-1} \]
\[
d = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}
  - \begin{bmatrix} 1 & 1 & 1 \end{bmatrix}
    \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}
    \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}
    \left( \begin{bmatrix} 1 & 1 & 1 \end{bmatrix}
           \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} \right)^{-1}
\]
\[
d = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}
  - 6 \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} \left( \frac{1}{3} \right)
  = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}
  - \begin{bmatrix} 2 \\ 2 \\ 2 \end{bmatrix}
  = \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}
\]
Note that the mean deviation score is zero.
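For readers who want to check the arithmetic, applying the hypothetical deviation_scores sketch from earlier to the same raw scores reproduces this result:

    x = np.array([[1.0], [2.0], [3.0]])   # x' = [1 2 3] written as an n x 1 column vector
    d = deviation_scores(x)
    print(d.ravel())                      # [-1.  0.  1.]
    print(d.mean())                       # 0.0 -- the mean deviation score is zero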
Deviation Scores: Matrices
Let X be an r x c matrix holding raw scores, and let x be the corresponding r x c matrix holding deviation scores.
When transforming the raw scores in X into the deviation scores in x, we often want to compute deviation scores separately within each column, consistent with the equation below.
\[ x_{rc} = X_{rc} - \bar{X}_c \]
where