1 Definitions
1. Let x be a random vector, and define
      µ = E(x)   (1.1.1)
      Σx = E[(x − µ)(x − µ)⊤]   (1.1.2)
   Then Σx is defined to be the covariance of x.
2. For x ∼ N(µ, Σx),
      fX(x) = (1 / ((2π)^(n/2) |Σx|^(1/2))) exp(−(1/2)(x − µ)⊤Σx⁻¹(x − µ))   (1.2.1)
   where n is the dimension of x.
3. The correlation coefficient is defined as
      ρ = E[(x1 − µ1)(x2 − µ2)] / (σ1σ2)   (1.3.1)
   where µi, σi² are the mean and variance of xi. (A brief numerical illustration of these definitions follows this list.)
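As an illustration of these definitions, here is a minimal numerical sketch in Python (assuming NumPy is available; the mean, variances, and correlation used below are illustrative values, not part of the problem statement). It draws samples of a bivariate normal x, estimates µ and Σx empirically as in (1.1.1) and (1.1.2), and recovers the correlation coefficient of (1.3.1) from the off-diagonal entry of Σx.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not given in the problem statement).
mu = np.array([1.0, -2.0])            # mean vector µ
sigma1, sigma2, rho = 2.0, 0.5, 0.3   # standard deviations and correlation
Sigma_x = np.array([[sigma1**2,             rho * sigma1 * sigma2],
                    [rho * sigma1 * sigma2, sigma2**2            ]])

# Samples of x ∼ N(µ, Σx).
x = rng.multivariate_normal(mu, Sigma_x, size=200_000)

# Empirical (1.1.1) and (1.1.2): µ = E(x), Σx = E[(x − µ)(x − µ)⊤].
mu_hat = x.mean(axis=0)
centered = x - mu_hat
Sigma_hat = centered.T @ centered / len(x)

# Empirical (1.3.1): ρ = E[(x1 − µ1)(x2 − µ2)] / (σ1σ2).
rho_hat = Sigma_hat[0, 1] / np.sqrt(Sigma_hat[0, 0] * Sigma_hat[1, 1])

print(mu_hat)     # ≈ µ
print(Sigma_hat)  # ≈ Σx, with off-diagonal entries ρσ1σ2
print(rho_hat)    # ≈ ρ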
2 Problems
1. Show that
      E[(x − µ)(x − µ)⊤] = ( σ1²     ρσ1σ2
                             ρσ1σ2   σ2²   )   (2.1.1)
2. Prove that Σx is a diagonal matrix when x1 and x2 are independent.
3. Let
      z1 = x1 + x2   (2.3.1)
      z2 = x1 − x2   (2.3.2)
   Find P such that
      z = Px   (2.3.3)
   (A numerical sketch for Problems 3–5 follows this list.)
4. Show that
      Σz = PΣxP⊤   (2.4.1)
5. Check the independence of z1 and z2 given that σ1 = σ2.
6. Show that the columns of P are eigenvectors of Σz.
7. Show that the eigenvectors of Σz are orthogonal to each other.
8. Summarize your conclusion in one line.
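The following is a minimal numerical sketch for Problems 3–5 in Python (assuming NumPy; the particular P below is one candidate read off from (2.3.1)–(2.3.3) and is used here as an assumption, and a numerical check is of course not a substitute for the requested proofs). It forms z = Px, compares the sample covariance of z with PΣxP⊤ from (2.4.1), and checks that the off-diagonal entry of Σz comes out numerically zero when σ1 = σ2, the case examined in Problem 5.

import numpy as np

rng = np.random.default_rng(1)

# Candidate P for (2.3.3), read off from z1 = x1 + x2 and z2 = x1 − x2 (an assumption).
P = np.array([[1.0,  1.0],
              [1.0, -1.0]])

# Illustrative Σx with σ1 = σ2 (the setting of Problem 5); the values are assumptions.
sigma, rho = 1.5, 0.4
Sigma_x = sigma**2 * np.array([[1.0, rho],
                               [rho, 1.0]])

x = rng.multivariate_normal(np.zeros(2), Sigma_x, size=200_000)
z = x @ P.T                          # applies z = Px to every sample row

Sigma_z_formula = P @ Sigma_x @ P.T  # (2.4.1)
Sigma_z_empirical = np.cov(z, rowvar=False)

print(Sigma_z_formula)    # diagonal here: the off-diagonal entry is σ1² − σ2² = 0
print(Sigma_z_empirical)  # ≈ PΣxP⊤; near-zero off-diagonal ⇒ z1, z2 uncorrelated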