Prove that Var(aY) = a^2 Var(Y), Var(Y + a) = Var(Y), and Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).
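As a numerical sanity check of these three identities, here is a small numpy sketch; the distributions below are arbitrary choices, not part of the problem. The identities hold exactly for sample moments (with matching ddof=0 conventions), so the comparisons are deterministic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
a = 3.0

# Arbitrary correlated pair (X, Y) for the third identity.
x = rng.normal(2.0, 1.5, n)
y = 0.4 * x + rng.normal(0.0, 1.0, n)

# Sample covariance with the same ddof=0 convention as np.var.
cov_xy = np.cov(x, y, bias=True)[0, 1]

# Var(aY) = a^2 Var(Y)
assert abs(np.var(a * y) - a**2 * np.var(y)) < 1e-6
# Var(Y + a) = Var(Y): adding a constant does not change spread.
assert abs(np.var(y + a) - np.var(y)) < 1e-6
# Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
assert abs(np.var(x + y) - (np.var(x) + np.var(y) + 2 * cov_xy)) < 1e-6
```

Because these are algebraic identities in the empirical moments, they hold to floating-point precision, not just approximately.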
Prove that MSE(θ̂_n) = Var(θ̂_n) + Bias(θ̂_n)^2, i.e., E[(θ̂_n − θ)^2] = E[(θ̂_n − E(θ̂_n))^2] + [E(θ̂_n) − θ]^2.
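A quick Monte Carlo sketch of this decomposition, using numpy; the deliberately biased estimator (0.9 times the sample mean of a normal sample) is an arbitrary illustration, not part of the problem. Since the decomposition is an algebraic identity in empirical moments, it holds to floating-point precision.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 2.0            # true parameter
n, reps = 20, 100_000  # sample size per replication, number of replications

# A deliberately biased estimator: 0.9 * sample mean.
samples = rng.normal(theta, 1.0, size=(reps, n))
est = 0.9 * samples.mean(axis=1)

mse = np.mean((est - theta) ** 2)          # E[(theta_hat - theta)^2]
var = np.var(est)                          # E[(theta_hat - E theta_hat)^2]
bias2 = (np.mean(est) - theta) ** 2        # [E(theta_hat) - theta]^2

# MSE = Var + Bias^2
assert abs(mse - (var + bias2)) < 1e-9
```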
Let X be a continuous random variable. Prove Var(X) … (assuming Var(X) exists).
Let X and Y be independent identically distributed random variables with means µ_X and µ_Y respectively. Prove the following.
a. E[aX + bY] = a µ_X + b µ_Y for any constants a and b.
b. Var[X] = E[X^2] − E[X]^2
c. Var[aX] = a^2 Var[X] for any constant a.
d. Assume for this part only that X and Y are not independent. Then Var[X + Y] = Var[X] + Var[Y] + 2(E[XY] − E[X] E[Y]).
e. …
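The identities in parts (a)–(d) can be sanity-checked numerically; the sketch below uses numpy with an arbitrary dependent pair (X, Y) for part (d). All four are exact identities in the sample moments (ddof=0 throughout), so tight tolerances are safe.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
a, b = 2.0, -3.0

x = rng.exponential(1.0, n)
y = 0.5 * x + rng.normal(0.0, 1.0, n)   # dependent, as part (d) requires

# (a) linearity of expectation
assert abs(np.mean(a * x + b * y) - (a * np.mean(x) + b * np.mean(y))) < 1e-6
# (b) Var[X] = E[X^2] - E[X]^2
assert abs(np.var(x) - (np.mean(x**2) - np.mean(x) ** 2)) < 1e-6
# (c) Var[aX] = a^2 Var[X]
assert abs(np.var(a * x) - a**2 * np.var(x)) < 1e-6
# (d) Var[X+Y] = Var[X] + Var[Y] + 2(E[XY] - E[X]E[Y])
cov = np.mean(x * y) - np.mean(x) * np.mean(y)
assert abs(np.var(x + y) - (np.var(x) + np.var(y) + 2 * cov)) < 1e-6
```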
Obtain E(Z|X) and Var(Z|X), and verify that E(E(Z|X)) = E(Z) and Var(E(Z|X)) + E(Var(Z|X)) = Var(Z).
3. Let X, Y be independent Exponential(1) random variables. Define Z = 1 if X + Y < 2, and Z = 0 otherwise. Obtain E(Z|X) and Var(Z|X), and verify that E(E(Z|X)) = E(Z) and Var(E(Z|X)) + E(Var(Z|X)) = Var(Z).
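Assuming the (garbled) event is {X + Y < 2}, the conditional law is simple: given X = x < 2, Z is Bernoulli(1 − e^{−(2−x)}), and Z = 0 when x ≥ 2; also P(X + Y < 2) = 1 − 3e^{−2} from the Gamma(2,1) CDF. A Monte Carlo sketch of the tower property and the law of total variance under that assumption:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

x = rng.exponential(1.0, n)
y = rng.exponential(1.0, n)
z = (x + y < 2).astype(float)              # Z = 1{X + Y < 2} (assumed event)

# Given X = x < 2: Z ~ Bernoulli(1 - exp(-(2 - x))); for x >= 2, Z = 0.
g = np.where(x < 2, 1 - np.exp(-(2 - x)), 0.0)   # E(Z|X)
v = g * (1 - g)                                  # Var(Z|X), Bernoulli variance

p = 1 - 3 * np.exp(-2)                     # P(X + Y < 2) = E(Z), analytically

assert abs(np.mean(z) - p) < 5e-3          # simulation matches the exact p
assert abs(np.mean(g) - p) < 5e-3          # E[E(Z|X)] = E(Z)
# Law of total variance: Var(E(Z|X)) + E(Var(Z|X)) = Var(Z) = p(1 - p)
assert abs(np.var(g) + np.mean(v) - p * (1 - p)) < 5e-3
```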
a. Consider the multiple regression model y = Xβ + ε, with E(ε) = 0 and Var(ε) = σ^2 I, and a linear function c'β of β. Show that the change in the estimate c'β̂ when the i-th observation is deleted is c'β̂ − c'β̂_(i) = …, where C = (X'X)^{-1}X'.
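The right-hand side of the identity is partly illegible above; one standard form of this deletion result is c'β̂ − c'β̂_(i) = c'(X'X)^{-1} x_i ê_i / (1 − h_ii), where ê_i is the i-th residual and h_ii the i-th leverage. The numpy sketch below (simulated design and response, arbitrary c) checks that form against a direct refit without row i:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 30, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)
c = np.array([0.0, 1.0, 1.0])        # arbitrary linear function c'beta

XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y             # full-data OLS estimate
H = X @ XtX_inv @ X.T                # hat matrix
e = y - X @ beta                     # full-data residuals

i = 7                                # observation to delete
mask = np.arange(n) != i
beta_del = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]

direct = c @ beta - c @ beta_del                       # refit difference
formula = c @ XtX_inv @ X[i] * e[i] / (1 - H[i, i])    # closed-form update
assert abs(direct - formula) < 1e-8
```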
For constants a and b and random variables X and Y, prove that Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y). If X and Y are uncorrelated, what does the result reduce to?
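For the uncorrelated case the identity reduces to Var(aX + bY) = a^2 Var(X) + b^2 Var(Y). A numpy sketch with independent (hence uncorrelated) samples; the sample covariance is only approximately zero, so the reduced formula holds up to Monte Carlo error rather than exactly:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
a, b = 2.0, 5.0

x = rng.normal(0.0, 1.0, n)
y = rng.normal(0.0, 2.0, n)          # independent of x, so Cov(X, Y) = 0

lhs = np.var(a * x + b * y)
# With Cov(X, Y) = 0 the identity reduces to a^2 Var(X) + b^2 Var(Y).
rhs = a**2 * np.var(x) + b**2 * np.var(y)
assert abs(lhs - rhs) / rhs < 1e-2   # agreement up to sampling noise
```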
C. (Theory) • Prove that if X ~ Exp(λ) for some λ > 0, then σ^2 = Var(X) = 1/λ^2.
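A simulation sketch of this fact for one arbitrary rate λ (numpy parameterizes the exponential by scale = 1/λ, not by the rate):

```python
import numpy as np

rng = np.random.default_rng(6)
lam = 2.5                          # rate parameter lambda > 0 (arbitrary)
n = 2_000_000

x = rng.exponential(1 / lam, n)    # scale = 1/lambda
# Var(X) should be close to 1/lambda^2 = 0.16
assert abs(np.var(x) - 1 / lam**2) < 5e-3
```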
Let X be a continuous random variable with the following density function. Find E(X) and Var(X).
f(x) = 6e^(−7x) for x > 0, and f(x) = 0 for x ≤ 0.
E(X) = 6/49, Var(X) = ___
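Taking the printed density at face value (note it integrates to 6/7 rather than 1, but its raw integrals match the shown answer E(X) = 6/49), the moments follow from the gamma integral ∫₀^∞ x^k e^{−7x} dx = k!/7^(k+1). An exact-arithmetic check with Python fractions:

```python
from fractions import Fraction
from math import factorial

# Gamma integral: integral_0^inf x^k e^{-7x} dx = k! / 7^(k+1),
# so the k-th raw moment of the printed f is 6 * k! / 7^(k+1).
def moment(k):
    return 6 * Fraction(factorial(k), 7 ** (k + 1))

EX = moment(1)        # 6 * 1/49  = 6/49, matching the shown answer
EX2 = moment(2)       # 6 * 2/343 = 12/343
var = EX2 - EX**2     # 84/2401 - 36/2401

assert EX == Fraction(6, 49)
print(var)            # 48/2401
```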