2. (7 pt) Recall that the variance of a random variable X is defined by Var(X)...
4. Recall that the covariance of random variables X and Y is defined by Cov(X, Y) = E[(X − E[X])(Y − E[Y])].
(a) (2 pt) TRUE or FALSE (circle one). E(XY) = 0 implies Cov(X, Y) = 0.
(b) (4 pt) a, b, c, d are constants. Mark each correct statement:
( ) Cov(aX, cY) = ac Cov(X, Y)
( ) Cov(aX + b, cY + d) = ac Cov(X, Y) + bc Cov(X, Y) + da Cov(X, Y) + bd
( )...
Problem 3. (1 point) Recall that the variance of a random variable is defined as Var[X] = E[(X − μ)²], where μ = E[X]. Use the properties of expectation to show that we can rewrite the variance of a random variable X as Var[X] = E[X²] − (E[X])².
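As a numerical sanity check (not a proof), the identity can be verified exactly on a small discrete distribution. The values and probabilities below are illustrative, not from the problem:

```python
# Exact check (not a proof) of Var[X] = E[X^2] - (E[X])^2
# on a small discrete distribution; the values below are illustrative.
values = [1.0, 2.0, 5.0]
probs = [0.2, 0.5, 0.3]

mu = sum(v * p for v, p in zip(values, probs))                     # E[X]
var_def = sum((v - mu) ** 2 * p for v, p in zip(values, probs))    # E[(X - mu)^2]
var_alt = sum(v * v * p for v, p in zip(values, probs)) - mu ** 2  # E[X^2] - (E[X])^2

print(var_def, var_alt)  # both equal 2.41 up to floating-point error
```

Both expressions agree because expanding (X − μ)² and using linearity of expectation turns one into the other.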
Q2. More about operations with expectations and covariances.
Recall that the variance of a random variable X is defined as Var(X) ≡ E[(X − E(X))²], the covariance is Cov(X, Y) ≡ E[(X − E(X))(Y − E(Y))], and the correlation is Corr(X, Y) ≡ Cov(X, Y)/√(Var(X) Var(Y)).
(a) What is the value of E[X − E(X)]? (Hint: Let μ denote E(X). Then the parameter μ is an unknown but fixed value, like a constant.) (0.5 pt)
(b) The following is a proof that Var(X) = E(X²) − E(X)²:
Var(X) = E[(X − E(X))²] = E[X² − 2X E(X) + E(X)²] = E(X²) − 2E(X)E(X) + E(X)² = E(X²) − E(X)².
In a similar way, prove that Cov(X, ...
Let X and Y be independent identically distributed random variables with means µ_X and µ_Y respectively. Prove the following.
a. E[aX + bY] = aµ_X + bµ_Y for any constants a and b.
b. Var[X] = E[X²] − (E[X])²
c. Var[aX] = a²Var[X] for any constant a.
d. Assume for this part only that X and Y are not independent. Then Var[X + Y] = Var[X] + Var[Y] + 2(E[XY] − E[X]E[Y]).
e....
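Part (d) of the problem above can be checked exactly on a small joint pmf for a dependent pair before attempting the proof. The probabilities below are made up for illustration:

```python
# Exact check of Var[X + Y] = Var[X] + Var[Y] + 2(E[XY] - E[X]E[Y])
# on a small joint pmf for a dependent pair; probabilities are illustrative.
pmf = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

def E(f):
    """Expectation of f(X, Y) under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in pmf.items())

ex, ey = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: (x - ex) ** 2)
var_y = E(lambda x, y: (y - ey) ** 2)
exy = E(lambda x, y: x * y)

lhs = E(lambda x, y: (x + y - ex - ey) ** 2)  # Var[X + Y] computed directly
rhs = var_x + var_y + 2 * (exy - ex * ey)     # the claimed identity
```

For this pmf the pair is positively correlated (E[XY] > E[X]E[Y]), so Var[X + Y] exceeds Var[X] + Var[Y].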
Q1. Operations with expectations and covariances.
Recall that the variance of a random variable X is defined as Var(X) ≡ E[(X − E(X))²] and the covariance is Cov(X, Y) ≡ E[(X − E(X))(Y − E(Y))].
As a hint, we can prove Cov(aX + b, cY) = ac Cov(X, Y) by
Cov(aX + b, cY) = E[(aX + b − E(aX + b))(cY − E(cY))] = E[a(X − E(X)) · c(Y − E(Y))] = ac E[(X − E(X))(Y − E(Y))] = ac Cov(X, Y).
In a similar manner, prove the following properties using the definition of the variance and the covariance:
(a) Var(X) = Cov(X, ...
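The hint's property (and the more general Cov(aX + b, cY + d) = ac Cov(X, Y)) can be sanity-checked exactly on a small joint pmf. All numbers below are illustrative; the point is that the additive constants b and d drop out:

```python
# Sanity check that Cov(aX + b, cY + d) = a * c * Cov(X, Y)
# on a small joint pmf; all values are illustrative.
pmf = {(0, 0): 0.2, (1, 0): 0.3, (0, 1): 0.1, (1, 1): 0.4}
a, b, c, d = 2.0, 5.0, -3.0, 1.0

def E(f):
    """Expectation of f(X, Y) under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in pmf.items())

def cov(u, v):
    """Cov(U, V) via the identity E[UV] - E[U]E[V]."""
    return E(lambda x, y: u(x, y) * v(x, y)) - E(u) * E(v)

cov_xy = cov(lambda x, y: x, lambda x, y: y)
cov_shifted = cov(lambda x, y: a * x + b, lambda x, y: c * y + d)
# cov_shifted equals a * c * cov_xy: the shifts b and d have no effect
```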
5. Suppose X and Y are random variables such that E(X) = E(Y) = θ, Var(X) = σ₁², and Var(Y) = σ₂². Consider a new random variable W = aX + (1 − a)Y.
(a) Show that W is unbiased for θ.
(b) If X and Y are independent, how should the constant a be chosen in order to minimize the variance of W?
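For part (b), independence gives Var(W) = a²σ₁² + (1 − a)²σ₂², and setting the derivative to zero yields a* = σ₂²/(σ₁² + σ₂²). A quick grid search can confirm this closed form; the variance values 4 and 1 below are hypothetical:

```python
# Grid-search check that a* = s2sq / (s1sq + s2sq) minimizes
# Var(W) = a^2 * s1sq + (1 - a)^2 * s2sq. Variances are hypothetical.
s1sq, s2sq = 4.0, 1.0

def var_w(a):
    return a ** 2 * s1sq + (1 - a) ** 2 * s2sq

grid = [i / 1000 for i in range(1001)]      # a in [0, 1] in steps of 0.001
a_best = min(grid, key=var_w)               # numerical minimizer
a_formula = s2sq / (s1sq + s2sq)            # closed form from calculus
```

Note that a* weights the less variable estimator more heavily, and Var(W) at the optimum is σ₁²σ₂²/(σ₁² + σ₂²), smaller than either individual variance.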
1) Let X and Y be random variables. Show that Cov(X + Y, X − Y) = Var(X) − Var(Y) without appealing to the general formulas for the covariance of linear combinations of random variables; use the basic identity Cov(Z₁, Z₂) = E[Z₁Z₂] − E[Z₁]E[Z₂], valid for any two random variables, and the properties of the expected value.
2) Let X be a normal random variable with zero mean and standard deviation σ. Let Φ(t) be the distribution function of the standard normal random variable....
(3) Let X = (X₁, X₂) be a two-dimensional random vector with variance matrix Var[X] = [σ₁₁ σ₁₂; σ₂₁ σ₂₂]. Compute Cov[a₁X₁ + a₂X₂, b₁X₁ + b₂X₂], where a₁, a₂, b₁, b₂ are given constants.
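By bilinearity of the covariance, the answer has the form Cov[a₁X₁ + a₂X₂, b₁X₁ + b₂X₂] = Σᵢⱼ aᵢ bⱼ (Var[X])ᵢⱼ. This can be verified exactly on a small joint pmf; the distribution and constants below are illustrative only:

```python
# Exact check that Cov[a1*X1 + a2*X2, b1*X1 + b2*X2] = sum_ij a_i * b_j * Sigma_ij,
# where Sigma is the covariance matrix of (X1, X2). Values are illustrative.
pmf = {(0.0, 0.0): 0.3, (1.0, 0.0): 0.2, (0.0, 1.0): 0.2, (1.0, 1.0): 0.3}
a1, a2, b1, b2 = 1.0, 2.0, -1.0, 0.5

def E(f):
    """Expectation of f(X1, X2) under the joint pmf."""
    return sum(f(x1, x2) * p for (x1, x2), p in pmf.items())

m1, m2 = E(lambda x1, x2: x1), E(lambda x1, x2: x2)
s11 = E(lambda x1, x2: (x1 - m1) ** 2)           # Sigma_11
s22 = E(lambda x1, x2: (x2 - m2) ** 2)           # Sigma_22
s12 = E(lambda x1, x2: (x1 - m1) * (x2 - m2))    # Sigma_12 = Sigma_21

u = lambda x1, x2: a1 * x1 + a2 * x2
v = lambda x1, x2: b1 * x1 + b2 * x2
mu, mv = E(u), E(v)
cov_direct = E(lambda x1, x2: (u(x1, x2) - mu) * (v(x1, x2) - mv))
cov_formula = a1 * b1 * s11 + (a1 * b2 + a2 * b1) * s12 + a2 * b2 * s22
```

In matrix notation the same quantity is aᵀ Var[X] b.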
For constants a and b, and random variables X and Y, prove that Var(aX + bY) = a²Var(X) + b²Var(Y) + 2ab Cov(X, Y). If X and Y are uncorrelated, what is the result?
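Before proving the identity, it can be sanity-checked exactly on a small joint pmf; the distribution and the constants below are made up for illustration:

```python
# Exact check of Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)
# on a small joint pmf; numbers are illustrative only.
pmf = {(0, 1): 0.1, (1, 1): 0.4, (0, 2): 0.3, (1, 2): 0.2}
a, b = 3.0, -2.0

def E(f):
    """Expectation of f(X, Y) under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in pmf.items())

ex, ey = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: (x - ex) ** 2)
var_y = E(lambda x, y: (y - ey) ** 2)
cov_xy = E(lambda x, y: (x - ex) * (y - ey))

ms = E(lambda x, y: a * x + b * y)
lhs = E(lambda x, y: (a * x + b * y - ms) ** 2)  # Var(aX + bY) directly
rhs = a ** 2 * var_x + b ** 2 * var_y + 2 * a * b * cov_xy
```

When X and Y are uncorrelated, Cov(X, Y) = 0, so the cross term vanishes and the formula reduces to a²Var(X) + b²Var(Y).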