Consider a random vector X in R^p with mean EX = mu; its covariance matrix Sigma = E[(X - mu)(X - mu)^T] is a p x p matrix. Since expectation is linear, this expands to Sigma = E[XX^T] - mu mu^T. In the spectral decomposition Sigma = U D U^T, U is made up of orthonormal eigenvectors of Sigma.
Q4. Suppose we want to find a unit vector a to project X in R^p onto a 1-dimensional space, such that a = argmax_{||a|| = 1} Var(a^T X). Prove that a is the eigenvector corresponding to the largest eigenvalue of the covariance matrix Sigma.
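A quick numerical check of the claim in Q4 (not part of the original exercise): for any random positive-definite Sigma, the variance a^T Sigma a of a projection is maximized over unit vectors by the top eigenvector. The dimension, seed, and matrix below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random symmetric positive-definite covariance matrix Sigma (p = 4).
p = 4
A = rng.standard_normal((p, p))
Sigma = A @ A.T

# Eigenvector for the largest eigenvalue of Sigma (eigh returns ascending order).
eigvals, eigvecs = np.linalg.eigh(Sigma)
a_star = eigvecs[:, -1]

# Var(a^T X) = a^T Sigma a; no random unit vector should beat the top eigenvector.
top_var = a_star @ Sigma @ a_star
for _ in range(1000):
    a = rng.standard_normal(p)
    a /= np.linalg.norm(a)
    assert a @ Sigma @ a <= top_var + 1e-9

print(np.isclose(top_var, eigvals[-1]))  # the maximal variance equals the top eigenvalue
```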
Suppose that x is a p-dimensional random vector with mean mu and covariance Sigma = U D U^T, where U = [u_1 u_2 ... u_p] and D = diag(d_1, d_2, ..., d_p), with u_1, ..., u_p orthonormal. Show that

Cov(u_i^T (x - mu), u_j^T (x - mu)) = d_i if i = j, and 0 otherwise.
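The identity above can be verified without sampling, since the covariance matrix of the projections U^T (x - mu) is exactly U^T Sigma U = D. A minimal sketch (the dimension, seed, and diagonal entries are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
p = 3

# Construct Sigma = U D U^T with orthonormal columns u_1, ..., u_p.
U, _ = np.linalg.qr(rng.standard_normal((p, p)))  # orthonormal U via QR
D = np.diag([5.0, 2.0, 0.5])                      # d_1, ..., d_p
Sigma = U @ D @ U.T

# Population covariance of the projections u_i^T (x - mu) is U^T Sigma U = D:
proj_cov = U.T @ Sigma @ U
print(np.allclose(proj_cov, D))  # d_i on the diagonal, 0 off-diagonal
```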
3. For n >= 2, let X have an n-dimensional normal distribution MN(mu, V). For any 1 <= m < n, let X1 denote the vector consisting of the last n - m coordinates of X.
(a) Find the mean vector and the variance-covariance matrix of X1.
(b) Show that X1 is an (n - m)-dimensional normal random vector.
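A Monte Carlo sanity check of part (a), not part of the exercise: the mean and covariance of the subvector X1 are the matching sub-blocks mu[m:] and V[m:, m:]. The sizes, seed, and tolerances below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 5, 2

mu = np.arange(n, dtype=float)
A = rng.standard_normal((n, n))
V = A @ A.T  # a valid covariance matrix for X

# X1 = last n - m coordinates; its mean/covariance are the matching sub-blocks.
mu1 = mu[m:]
V1 = V[m:, m:]

# Empirical moments of the sampled subvector should match the sub-blocks.
X = rng.multivariate_normal(mu, V, size=200_000)
X1 = X[:, m:]
print(np.allclose(X1.mean(axis=0), mu1, atol=0.05),
      np.allclose(np.cov(X1.T), V1, atol=0.2))
```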
1.4 Suppose x is a random vector drawn from a d-dimensional multivariate Gaussian distribution with mean 0 and covariance Sigma. Define y := Qx + v for a known (invertible) d x d matrix Q and a d x 1 vector v. What is the distribution of y?
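The standard answer (an affine map of a Gaussian is Gaussian, y ~ N(v, Q Sigma Q^T)) can be checked empirically. A sketch, with an illustrative diagonal Sigma, seed, and tolerances of my choosing:

```python
import numpy as np

rng = np.random.default_rng(3)
d = 3

Sigma = np.diag([1.0, 2.0, 3.0])
Q = rng.standard_normal((d, d))          # invertible almost surely for a random Q
v = np.array([1.0, -1.0, 2.0])

# Claim: y = Qx + v is Gaussian with mean v and covariance Q Sigma Q^T.
x = rng.multivariate_normal(np.zeros(d), Sigma, size=300_000)
y = x @ Q.T + v

print(np.allclose(y.mean(axis=0), v, atol=0.05),
      np.allclose(np.cov(y.T), Q @ Sigma @ Q.T, atol=0.25))
```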
Some Extra Definitions. Recall that, for a nonrandom real number c and a random variable X, we have Var(cX) = c^2 Var(X). In this problem we'll generalize this property to linear combinations! Let a = (a_1, ..., a_n)^T be a vector of real nonrandom numbers, and let X = (X_1, ..., X_n)^T be a vector of random variables (sometimes called a random vector). Last, define the covariance matrix Sigma to be the matrix with all the covariances arranged into a matrix, i.e., Sigma_ij = Cov(X_i, X_j). When we talk about taking the...
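The generalization the problem is pointing at is Var(a^T X) = a^T Sigma a, which reduces to Var(cX) = c^2 Var(X) when n = 1. A quick empirical check, with an illustrative a, Sigma, and seed:

```python
import numpy as np

rng = np.random.default_rng(4)

# Var(a^T X) = a^T Sigma a generalizes Var(cX) = c^2 Var(X).
a = np.array([2.0, -1.0, 0.5])
Sigma = np.array([[2.0, 0.3, 0.1],
                  [0.3, 1.0, 0.2],
                  [0.1, 0.2, 0.5]])

X = rng.multivariate_normal(np.zeros(3), Sigma, size=400_000)
proj = X @ a

# Sample variance of the projections vs. the quadratic form a^T Sigma a.
print(np.isclose(proj.var(), a @ Sigma @ a, atol=0.1))
```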
3. Let N = (N_1, ..., N_r) be a multinomial(m; p_1, ..., p_r) random vector. Compute the mean and covariance matrix of N; that is, find E(N_i) and Cov(N_i, N_j) for i, j = 1, ..., r. Computing the latter can be done directly (least recommended), by expressing N_i as an appropriate sum of Bernoulli RVs, or by looking at N_i + N_j.
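The known closed forms (E N_i = m p_i and Cov(N_i, N_j) = m p_i (1 - p_i) on the diagonal, -m p_i p_j off it) can be checked against simulated draws. A sketch with illustrative m, p, and seed:

```python
import numpy as np

rng = np.random.default_rng(5)
m, p = 10, np.array([0.2, 0.3, 0.5])

# Closed-form answers: E N_i = m p_i, Cov(N_i, N_j) = m (p_i [i==j] - p_i p_j).
mean_formula = m * p
cov_formula = m * (np.diag(p) - np.outer(p, p))

N = rng.multinomial(m, p, size=500_000)
print(np.allclose(N.mean(axis=0), mean_formula, atol=0.02),
      np.allclose(np.cov(N.T), cov_formula, atol=0.03))
```

Note the off-diagonal entries are negative: since the counts sum to m, a large N_i forces the other counts down.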
Consider two random variables, X and Y. Let E(X) and E(Y) denote the population means of X and Y respectively, and let Var(X) and Var(Y) denote the population variances. Consider another random variable Z that is a linear combination of X and Y: Z = 3X - Y. What is the population variance of Z? Assume that X and Y are independent, which is to say that their covariance is zero.
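For independent X and Y, Var(3X - Y) = 9 Var(X) + Var(Y), since the coefficient is squared and the cross term vanishes. A quick simulation with illustrative variances and seed:

```python
import numpy as np

rng = np.random.default_rng(6)

# Independent X, Y with known variances; Z = 3X - Y.
var_x, var_y = 4.0, 2.0
X = rng.normal(0.0, np.sqrt(var_x), size=500_000)
Y = rng.normal(0.0, np.sqrt(var_y), size=500_000)
Z = 3 * X - Y

# Var(Z) = 9 Var(X) + Var(Y) = 38 when Cov(X, Y) = 0.
print(np.isclose(Z.var(), 9 * var_x + var_y, atol=0.5))
```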
Exercise 6 (6.4.35, p. 452). Let A in C^{n x n}, and let S be a k-dimensional subspace of C^n. A vector v in S is called a Ritz vector of A from S if and only if there is a mu in C such that the Rayleigh-Ritz-Galerkin condition (Av - mu v) perp S holds, that is, <Av - mu v, s> = 0 for all s in S. The scalar mu is called the Ritz value of A associated with v. Let q_1, ..., q_k be...
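The statement presumably continues with q_1, ..., q_k an orthonormal basis of S; in that case the Ritz pairs are exactly the eigenpairs of Q^H A Q lifted back by v = Qy. A numerical sketch under that assumption, using a real symmetric A for simplicity:

```python
import numpy as np

rng = np.random.default_rng(7)
n, k = 6, 3

A = rng.standard_normal((n, n))
A = A + A.T                                # symmetric, so Ritz values are real

# Orthonormal basis q_1, ..., q_k of a k-dimensional subspace S.
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))

# Rayleigh-Ritz: eigenpairs (mu, y) of Q^T A Q give Ritz pairs (mu, v = Q y).
mus, Y = np.linalg.eigh(Q.T @ A @ Q)
V = Q @ Y

# Galerkin condition: A v - mu v is orthogonal to S, i.e. Q^T (A v - mu v) = 0.
residuals = Q.T @ (A @ V - V * mus)
print(np.allclose(residuals, 0.0))
```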
Suppose X is a random vector, X = (X(1), ..., X(d))^T, with mean 0 and covariance matrix vv^T for some vector v in R^d. Let v~ = v/||v|| (i.e., v~ is the normalized version of v). What is Var(v~^T X)?
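Since Var(a^T X) = a^T Sigma a, here Var(v~^T X) = v~^T (vv^T) v~ = (v~^T v)^2 = ||v||^2. A minimal check with an illustrative v:

```python
import numpy as np

# X has mean 0 and covariance v v^T; project onto the unit vector v / ||v||.
v = np.array([3.0, 4.0])
v_unit = v / np.linalg.norm(v)

Sigma = np.outer(v, v)                     # covariance matrix v v^T

# Var(v_unit^T X) = v_unit^T (v v^T) v_unit = (v_unit^T v)^2 = ||v||^2 = 25.
var_proj = v_unit @ Sigma @ v_unit
print(np.isclose(var_proj, np.linalg.norm(v) ** 2))  # True
```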
Calculate the following for the random vector (X, Y) with joint pdf f(x, y) = -(3/4)(x + y) for 2x < y < 0, -1 < x < 0.
1. The marginal pdf of X and the marginal pdf of Y. Are X and Y independent random variables?
2. The expected value and variance of X and Y respectively.
3. The joint cdf in the case 2x < y < 0, -1 < x < 0.
4. The expected value of the random variable Z defined as Z = X^2 Y^2.
5. The covariance between X and Y.
6. The expected value and...
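The integrals in parts 1, 2, and 5 can be checked symbolically. A sketch with SymPy, assuming the reconstructed density f(x, y) = -(3/4)(x + y) on the triangle 2x < y < 0, -1 < x < 0 (note the minus sign is needed for f to be nonnegative there, and it makes the density integrate to 1):

```python
from sympy import symbols, integrate, Rational, simplify

x, y = symbols("x y")
f = -Rational(3, 4) * (x + y)              # joint pdf on 2x < y < 0, -1 < x < 0

# Marginal of X: integrate out y over (2x, 0).
f_X = integrate(f, (y, 2 * x, 0))          # = 3x^2 on (-1, 0)
print(simplify(f_X))

# Total mass should be 1.
print(integrate(f_X, (x, -1, 0)))

# E[X] and Var(X) from the marginal.
EX = integrate(x * f_X, (x, -1, 0))
VarX = integrate(x**2 * f_X, (x, -1, 0)) - EX**2
print(EX, VarX)

# Cov(X, Y) = E[XY] - E[X] E[Y] via double integrals over the region.
EXY = integrate(integrate(x * y * f, (y, 2 * x, 0)), (x, -1, 0))
EY = integrate(integrate(y * f, (y, 2 * x, 0)), (x, -1, 0))
print(simplify(EXY - EX * EY))
```

Since f_X(x) = 3x^2 but the support of Y given X depends on x, X and Y cannot be independent.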