Let X and Y be independent, identically distributed random variables with means µx and µy respectively. Prove the following.
a. E[aX + bY] = aµx + bµy for any constants a and b.
b. Var[X] = E[X²] − (E[X])².
c. Var[aX] = a²Var[X] for any constant a.
d. Assume for this part only that X and Y are not independent. Then Var[X + Y] = Var[X] + Var[Y] + 2(E[XY] − E[X]E[Y]).
e. Use the previous part and the assumption that X and Y are independent to show that Var[aX + bY] = a²Var[X] + b²Var[Y] for any constants a and b.
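Before proving these identities, it can help to check them numerically. Below is a Monte Carlo sanity check (a sketch, not part of the problem; it assumes X, Y ~ Exponential(1), so that µx = µy = 1 and Var[X] = Var[Y] = 1):

```python
import numpy as np

# Monte Carlo sanity check of identities (a)-(e).
# Assumption (not from the problem): X, Y ~ Exponential(1),
# so mu_x = mu_y = 1 and Var[X] = Var[Y] = 1.
rng = np.random.default_rng(42)
n = 1_000_000
a, b = 2.0, -3.0

X = rng.exponential(1.0, n)
Y = rng.exponential(1.0, n)

# (a) E[aX + bY] = a*mu_x + b*mu_y = 2 - 3 = -1
print(np.mean(a * X + b * Y))

# (b) Var[X] = E[X^2] - (E[X])^2; both sides should be close to 1
print(np.mean(X**2) - np.mean(X)**2, np.var(X))

# (c) Var[aX] = a^2 Var[X]; both sides should be close to 4
print(np.var(a * X), a**2 * np.var(X))

# (d) Dependent case: Z = X + standard normal noise, so Cov(X, Z) = Var[X] = 1.
# Var[X + Z] should match Var[X] + Var[Z] + 2(E[XZ] - E[X]E[Z]), about 5.
Z = X + rng.normal(0.0, 1.0, n)
lhs_d = np.var(X + Z)
rhs_d = np.var(X) + np.var(Z) + 2 * (np.mean(X * Z) - np.mean(X) * np.mean(Z))
print(lhs_d, rhs_d)

# (e) Independent case: Var[aX + bY] = a^2 Var[X] + b^2 Var[Y], about 13
print(np.var(a * X + b * Y), a**2 * np.var(X) + b**2 * np.var(Y))
```

Note that (b), (c), and (d) agree to floating-point precision because they are algebraic identities that hold for the sample moments as well; (a) and (e) agree only up to Monte Carlo error, since independence of X and Y holds exactly in distribution but only approximately in the sample.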
For clarification regarding the proof of part b: applying the identity in part b to the random variable X² gives

Var(X²) = E[(X²)²] − (E[X²])²
Var(X²) = E[X⁴] − (E[X²])²

where E[X⁴] is the fourth moment of X.
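This clarification can also be verified numerically (a sketch, again assuming X ~ Exponential(1); its k-th moment is k!, so E[X²] = 2, E[X⁴] = 24, and Var(X²) = 24 − 2² = 20):

```python
import numpy as np

# Numerical check of Var(X^2) = E[X^4] - (E[X^2])^2.
# Assumption (not from the problem): X ~ Exponential(1), whose k-th
# moment is k!, so E[X^2] = 2, E[X^4] = 24, and Var(X^2) = 24 - 4 = 20.
rng = np.random.default_rng(0)
X = rng.exponential(1.0, 1_000_000)
W = X**2

fourth_moment = np.mean(X**4)            # estimate of E[X^4], about 24
second_moment = np.mean(X**2)            # estimate of E[X^2], about 2
print(np.var(W))                         # sample Var(X^2), about 20
print(fourth_moment - second_moment**2)  # same quantity via the identity
```

The two printed values agree to floating-point precision, since the identity also holds for sample moments; both are close to the theoretical value 20 only up to Monte Carlo error, as X² is fairly heavy-tailed.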