Random Variables and Expectation

I. Sample Statistics

X = (n x p) data matrix with n rows of subjects and p measurements.

J = (n x n) matrix of all ones.

Sample mean matrix:

M = (JX)/n

m_{jk} = (x_{1k} + x_{2k} + … + x_{nk})/n = m_k
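
A minimal NumPy sketch (not from the original notes; the data matrix X is simulated and the sizes n and p are arbitrary) showing that M = JX/n stacks the vector of column means into every row:

import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 3                      # illustrative sizes: n subjects, p measurements
X = rng.normal(size=(n, p))      # simulated data matrix (stand-in for real data)

J = np.ones((n, n))              # (n x n) matrix of all ones
M = J @ X / n                    # sample mean matrix

# Every entry m_jk equals the column mean m_k, regardless of the row j,
# so each row of M is the vector of column means.
assert np.allclose(M, X.mean(axis=0))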

Sample Covariance Matrix:

S = (X-M)'(X-M)/(n-1)

s_{jk} = [(x_{1j} - m_j)(x_{1k} - m_k) + (x_{2j} - m_j)(x_{2k} - m_k) +

…+ (x_{nj} - m_j)(x_{nk} - m_k)]/(n-1)
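
A short check (again with simulated data) that the matrix formula S = (X-M)'(X-M)/(n-1) reproduces the entry-wise sums above and matches NumPy's built-in estimator:

import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 3                               # illustrative sizes
X = rng.normal(size=(n, p))               # simulated data matrix
M = np.ones((n, n)) @ X / n               # sample mean matrix from above

S = (X - M).T @ (X - M) / (n - 1)         # sample covariance matrix

# np.cov wants variables in columns here (rowvar=False); ddof=1 matches /(n-1).
assert np.allclose(S, np.cov(X, rowvar=False, ddof=1))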

Sample Correlation Matrix:

s_k = Sqrt(s_{kk})

D = Diag[ s_1 , s_2, …, s_p ]

R = D^{-1} S D^{-1}

r_{jk} = s_{jk} / (s_j s_k)
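
A sketch of the correlation construction with the same kind of simulated data; D^{-1} places 1/s_k on the diagonal, and the result is compared against np.corrcoef:

import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 3                               # illustrative sizes
X = rng.normal(size=(n, p))               # simulated data matrix
M = np.ones((n, n)) @ X / n
S = (X - M).T @ (X - M) / (n - 1)

s = np.sqrt(np.diag(S))                   # s_k = Sqrt(s_kk), the standard deviations
D_inv = np.diag(1.0 / s)                  # D^{-1}, with 1/s_k on the diagonal
R = D_inv @ S @ D_inv                     # correlation matrix, r_jk = s_jk / (s_j s_k)

assert np.allclose(R, np.corrcoef(X, rowvar=False))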

 

Mean and Variance of a Linear Transformation

Define X as an (n x p) data matrix, n = no. of observations, p = no. of variables

Construct a new set of variables

Y = XB

M_X = JX/n

S_X = (X - M_X)'(X - M_X)/(n-1)

Then

M_Y = JY/n = J(XB)/n = (JX/n)B = M_X B

S_Y = (Y - M_Y)'(Y - M_Y)/(n-1)

   = (XB - M_X B)'(XB - M_X B)/(n-1)

   = [(X - M_X)B]'[(X - M_X)B]/(n-1)

   = B'(X - M_X)'(X - M_X)B/(n-1)

   = B' S_X B
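
A numerical confirmation of the two identities just derived, M_Y = M_X B and S_Y = B' S_X B, using a simulated X and an arbitrary transformation matrix B:

import numpy as np

rng = np.random.default_rng(0)
n, p, q = 8, 3, 2                         # illustrative sizes
X = rng.normal(size=(n, p))               # simulated data matrix
B = rng.normal(size=(p, q))               # arbitrary transformation matrix

J = np.ones((n, n))
Mx = J @ X / n
Sx = (X - Mx).T @ (X - Mx) / (n - 1)

Y = X @ B                                 # the new variables Y = XB
My = J @ Y / n
Sy = (Y - My).T @ (Y - My) / (n - 1)

assert np.allclose(My, Mx @ B)            # M_Y = M_X B
assert np.allclose(Sy, B.T @ Sx @ B)      # S_Y = B' S_X B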

 

II. Expectation

X is a random variable that can be assigned one of the values

x_1 , x_2, …, x_j , …, x_M

Pr[X = x_j] = probability that the r.v. X takes on the value x_j.

Mean of X:

E[X] = x_1 Pr[X = x_1] + x_2 Pr[X = x_2] +

… + x_M Pr[X = x_M]
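
As a worked example (the values x_j and the probabilities here are made up), E[X] is the probability-weighted sum of the possible values:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])        # possible values x_1, ..., x_M (made up)
pr = np.array([0.1, 0.2, 0.3, 0.4])       # Pr[X = x_j]; probabilities sum to 1

E_X = np.sum(x * pr)                      # E[X] = sum_j x_j Pr[X = x_j]
print(E_X)                                # 0.1 + 0.4 + 0.9 + 1.6 = 3.0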

Two rules of expectation for random variables X and Y :

E[c X] = cE[X]

E[ X + Y ] = E[X] + E[Y]
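
A small check of both rules using a made-up joint distribution for two discrete random variables X and Y; expectations are computed exactly from the joint table:

import numpy as np

# Made-up joint distribution: joint[j, k] = Pr[X = x_j, Y = y_k].
x_vals = np.array([0.0, 1.0, 2.0])
y_vals = np.array([-1.0, 1.0])
joint = np.array([[0.10, 0.15],
                  [0.20, 0.25],
                  [0.10, 0.20]])

E_X = np.sum(x_vals * joint.sum(axis=1))            # marginal expectation of X
E_Y = np.sum(y_vals * joint.sum(axis=0))            # marginal expectation of Y

c = 3.0
E_cX = np.sum(c * x_vals * joint.sum(axis=1))
E_X_plus_Y = np.sum((x_vals[:, None] + y_vals[None, :]) * joint)

assert np.isclose(E_cX, c * E_X)                    # E[cX] = c E[X]
assert np.isclose(E_X_plus_Y, E_X + E_Y)            # E[X + Y] = E[X] + E[Y]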

Random Vectors X and Y.

X' = [X_1 X_2 … X_p] = (1 x p) row vector of p random variables

Y' = [Y_1 Y_2 … Y_q] = (1 x q) row vector of q random variables

E[X'] = [ E[X_1] E[X_2] … E[X_p] ] = E[X]'

Cov(X,Y) = E[ (X-E[X])(Y - E[Y])' ]

Two rules of expectation for random vectors X and Y:

E[C Y] = C E[Y]

Cov(BX, CY) = B Cov(X,Y) C'

Note: Same two rules apply to sample means and covariances.
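
To illustrate the note, a sketch that applies the two rules to sample means and sample covariances: with simulated data matrices (rows are observations of X' and Y') and arbitrary B and C, the sample versions of E[CY] = C E[Y] and Cov(BX, CY) = B Cov(X,Y) C' hold exactly:

import numpy as np

rng = np.random.default_rng(0)
n, p, q = 200, 3, 2                       # illustrative sizes
Xd = rng.normal(size=(n, p))              # rows are observations of X'
Yd = Xd @ rng.normal(size=(p, q)) + rng.normal(size=(n, q))   # simulated, correlated with X

B = rng.normal(size=(2, p))               # BX is a new (2 x 1) random vector
C = rng.normal(size=(3, q))               # CY is a new (3 x 1) random vector

def xcov(A, D):
    # sample cross-covariance matrix of two data matrices with matching rows
    Ac, Dc = A - A.mean(axis=0), D - D.mean(axis=0)
    return Ac.T @ Dc / (len(A) - 1)

Ud, Vd = Xd @ B.T, Yd @ C.T               # rows are observations of (BX)' and (CY)'

assert np.allclose(Vd.mean(axis=0), C @ Yd.mean(axis=0))       # E[CY] = C E[Y]
assert np.allclose(xcov(Ud, Vd), B @ xcov(Xd, Yd) @ C.T)       # Cov(BX, CY) = B Cov(X,Y) C'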