4. (a) For the random variable X, show that E[(X − a)²] is minimized when a = E(X). (b) For random variables X and Y, show that σ(X + Y) ≤ σ(X) + σ(Y), that is, the standard deviation of the sum is less than or equal to the sum of the standard deviations. (c) For random variables X and Y, prove the Cauchy–Schwarz inequality: [E(XY)]² ≤ E(X²)E(Y²).
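Parts (a) and (c) can be sanity-checked numerically before writing the proof. The sketch below uses a small discrete pmf whose values and probabilities are made up purely for illustration; it confirms that E[(X − a)²] is smallest at a = E(X) and that the Cauchy–Schwarz bound holds on a toy joint distribution.

```python
# Illustrative check for problem 4 on a small discrete distribution
# (values and probabilities below are made up, not from the problem).

def expectation(pmf, g=lambda v: v):
    """E[g(X)] for a pmf given as {value: probability}."""
    return sum(p * g(v) for v, p in pmf.items())

pmf = {0: 0.2, 1: 0.5, 3: 0.3}
mu = expectation(pmf)

def mse(a):
    """E[(X - a)^2]."""
    return expectation(pmf, lambda v: (v - a) ** 2)

# (a) Among a grid of candidates around E(X), the minimizer is a = E(X).
candidates = [mu + d / 10 for d in range(-20, 21)]
best = min(candidates, key=mse)
print(abs(best - mu) < 1e-9)  # True: the minimum sits at a = E(X)

# (c) Cauchy-Schwarz on a small joint pmf {(x, y): prob}.
joint = {(0, 1): 0.3, (1, 0): 0.4, (2, 2): 0.3}
Exy = sum(p * x * y for (x, y), p in joint.items())
Ex2 = sum(p * x * x for (x, y), p in joint.items())
Ey2 = sum(p * y * y for (x, y), p in joint.items())
print(Exy ** 2 <= Ex2 * Ey2)  # True
```

Because E[(X − a)²] is a strictly convex quadratic in a, the grid minimum coincides with the analytic minimizer a = E(X).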
X and Y are random variables. (a) Show that E(X) = E(E(X|Y)). (b) If P({X ≤ x, Y ≤ y}) = P({X ≤ x})P({Y ≤ y}) for all x, y, then show that E(XY) = E(X)E(Y), i.e., if two random variables are independent, then show that they are uncorrelated. Is the reverse true? Prove or disprove. (c) The moment generating function of a random variable Z is defined as Ψ_Z(t) = E(e^{tZ}). Now if X and Y are independent random variables, then show that Ψ_{X+Y}(t) = Ψ_X(t)Ψ_Y(t). Also, if Ψ_X(t) = … (d) Show the conditional...
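Parts (a) and (b) can be illustrated on a tiny joint pmf. The pair below is independent by construction (the distribution is invented for the example), so both the tower property E(X) = E(E(X|Y)) and the factorization E(XY) = E(X)E(Y) should come out exactly.

```python
# Sketch check of (a) E(X) = E(E(X|Y)) and (b) independence => uncorrelated,
# on a tiny joint pmf (the distribution is made up for illustration).

joint = {(x, y): px * py
         for x, px in [(0, 0.5), (2, 0.5)]
         for y, py in [(1, 0.3), (4, 0.7)]}  # independent by construction

EX = sum(p * x for (x, y), p in joint.items())

# E(E(X|Y)): average the conditional mean of X over the marginal of Y.
pY = {}
for (x, y), p in joint.items():
    pY[y] = pY.get(y, 0.0) + p
cond_mean = {y: sum(p * x for (x, y2), p in joint.items() if y2 == y) / pY[y]
             for y in pY}
EEXY = sum(pY[y] * cond_mean[y] for y in pY)
print(abs(EX - EEXY) < 1e-12)  # True

# (b) For this independent pair, E(XY) = E(X)E(Y).
EY = sum(p * y for (x, y), p in joint.items())
EXY = sum(p * x * y for (x, y), p in joint.items())
print(abs(EXY - EX * EY) < 1e-12)  # True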
1) Let X and Y be random variables. Show that Cov(X + Y, X − Y) = Var(X) − Var(Y) without appealing to the general formulas for the covariance of linear combinations of random variables; use the basic identity Cov(Z₁, Z₂) = E[Z₁Z₂] − E[Z₁]E[Z₂], valid for any two random variables, and the properties of the expected value. 2) Let X be the normal random variable with zero mean and standard deviation σ. Let Φ(t) be the distribution function of the standard normal random variable. …
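The identity in 1) holds for any joint distribution, since the cross terms Cov(X, Y) and Cov(Y, X) cancel. A quick finite check, using only the basic covariance identity the problem prescribes (the joint pmf is a toy example, not from the problem):

```python
# Check of Cov(X + Y, X - Y) = Var(X) - Var(Y) via
# Cov(Z1, Z2) = E[Z1*Z2] - E[Z1]*E[Z2] on an illustrative joint pmf.

joint = {(0, 0): 0.1, (0, 3): 0.2, (1, 1): 0.4, (2, 3): 0.3}  # {(x, y): prob}

def E(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(p * g(x, y) for (x, y), p in joint.items())

cov_sum_diff = (E(lambda x, y: (x + y) * (x - y))
                - E(lambda x, y: x + y) * E(lambda x, y: x - y))
var_x = E(lambda x, y: x * x) - E(lambda x, y: x) ** 2
var_y = E(lambda x, y: y * y) - E(lambda x, y: y) ** 2
print(abs(cov_sum_diff - (var_x - var_y)) < 1e-12)  # True
```

Note the pmf above is deliberately dependent, which demonstrates that no independence assumption is needed for the identity.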
5. Let X be a non-negative integer-valued random variable with positive expectation. Prove that … E[X²] … (Hint: Use the following special case of the Cauchy–Schwarz inequality: … First, make sure you see why this is a special case of the Cauchy–Schwarz inequality; then apply it to get one of the inequalities of this problem.)
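The exact statement and hint are elided in this copy. One standard special case of Cauchy–Schwarz (taking Y ≡ 1) is (E[X])² ≤ E[X²]; as an assumption-laden illustration, it can be verified on a toy non-negative integer pmf:

```python
# Assumed special case (E[X])^2 <= E[X^2] (Cauchy-Schwarz with Y = 1),
# checked on an illustrative non-negative integer pmf.
pmf = {0: 0.3, 1: 0.4, 2: 0.2, 5: 0.1}
EX = sum(p * k for k, p in pmf.items())
EX2 = sum(p * k * k for k, p in pmf.items())
print(EX ** 2 <= EX2)  # True
```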
11. Consider two independent random variables X and Y with the following means and standard deviations: … = 60; σ_Y = 15. (a) Find E(X + Y), Var(X + Y), E(X − Y), Var(X − Y). (b) If X* and Y* are the standardized r.v.'s corresponding to the r.v.'s X and Y, respectively, determine E(X* + Y*), E(X* − Y*), Var(X* + Y*), Var(X* − Y*).
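The mechanics are E(X ± Y) = μ_X ± μ_Y and, by independence, Var(X ± Y) = σ_X² + σ_Y² in both cases; standardized variables have mean 0 and variance 1, so Var(X* ± Y*) = 2. A sketch with assumed numbers (only σ_Y = 15 and the 60 survive in this garbled copy; μ_X and σ_X below are invented for illustration):

```python
# Problem 11 mechanics with assumed values: mu_X, sd_X are made up;
# mu_Y = 60 and sd_Y = 15 are taken from the (partly garbled) statement.
mu_x, sd_x = 50.0, 10.0   # assumption for illustration only
mu_y, sd_y = 60.0, 15.0

E_sum,  Var_sum  = mu_x + mu_y, sd_x**2 + sd_y**2
E_diff, Var_diff = mu_x - mu_y, sd_x**2 + sd_y**2  # variances add for X - Y too
print(E_sum, Var_sum, E_diff, Var_diff)  # 110.0 325.0 -10.0 325.0

# (b) Standardized variables: E(X*) = E(Y*) = 0 and Var(X*) = Var(Y*) = 1,
# so E(X* +/- Y*) = 0 and, by independence, Var(X* +/- Y*) = 1 + 1 = 2.
E_star, Var_star = 0.0, 1.0 + 1.0
print(E_star, Var_star)  # 0.0 2.0
```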
5) Let X be a random variable with mean E(X) = μ < ∞ and variance Var(X) = σ² ≠ 0. For any c > 0, P(|X − μ| ≥ c) ≤ σ²/c². This is a famous result known as Chebyshev's inequality. Suppose that Y₁, …, Yₙ are i.i.d. random variables, each with finite mean E(Y₁) = μ and finite variance Var(Y₁) = σ² ≠ 0. With Ȳ = (1/n) Σᵢ₌₁ⁿ Yᵢ, show that for any c > 0, P(|Ȳ − μ| ≥ c) → 0 as n → ∞. This is the celebrated Weak Law of Large Numbers.
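Chebyshev's inequality can be checked deterministically on any finite distribution by comparing the exact tail probability with the bound σ²/c². The pmf below is invented for illustration:

```python
# Deterministic check of Chebyshev's inequality P(|X - mu| >= c) <= sigma^2 / c^2
# on a toy pmf (values chosen only for illustration).
pmf = {-2: 0.1, 0: 0.6, 1: 0.2, 4: 0.1}
mu = sum(p * v for v, p in pmf.items())
var = sum(p * (v - mu) ** 2 for v, p in pmf.items())

ok = True
for c in (0.5, 1.0, 2.0, 3.0):
    tail = sum(p for v, p in pmf.items() if abs(v - mu) >= c)
    ok = ok and (tail <= var / c ** 2 + 1e-12)
print(ok)  # True
```

Applying the same bound to Ȳ, whose variance is σ²/n, gives P(|Ȳ − μ| ≥ c) ≤ σ²/(nc²) → 0, which is exactly the Weak Law of Large Numbers argument.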
Let X and Y be two independent random variables. Show that Cov (X, XY) = E(Y) Var(X).
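Since X and Y are independent, Cov(X, XY) = E[X²Y] − E[X]E[XY] = E[Y](E[X²] − E[X]²) = E(Y) Var(X). This factorization can be checked exactly on a product pmf (the marginals below are arbitrary illustrations):

```python
# Check of Cov(X, XY) = E(Y) Var(X) for independent X, Y on a product pmf
# (marginals below are made up for illustration).
px = {1: 0.5, 3: 0.5}
py = {0: 0.4, 2: 0.6}
joint = {(x, y): p * q for x, p in px.items() for y, q in py.items()}

EX   = sum(pr * x for (x, y), pr in joint.items())
EY   = sum(pr * y for (x, y), pr in joint.items())
EXXY = sum(pr * x * x * y for (x, y), pr in joint.items())
EXY  = sum(pr * x * y for (x, y), pr in joint.items())
VarX = sum(pr * x * x for (x, y), pr in joint.items()) - EX ** 2

cov = EXXY - EX * EXY   # Cov(X, XY) = E[X*XY] - E[X]E[XY]
print(abs(cov - EY * VarX) < 1e-12)  # True
```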
Problem 2. Suppose two continuous random variables (X, Y) ~ f(x, y). (1) Prove E(X + Y) = E(X) + E(Y). (2) Prove Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y). (3) Prove Cov(X, Y) = E(XY) − E(X)E(Y). (4) Prove that if X and Y are independent, i.e., f(x, y) = f_X(x) f_Y(y) for any (x, y), then Cov(X, Y) = 0. Is the reverse true? (5) Prove Cov(aX + b, cY + d) = ac Cov(X, Y). (6) Prove Cov(X, X) = Var(X).
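Identities (2), (3), and (5) are purely algebraic, so they can be verified exactly on a small discrete joint pmf even though the problem states them for continuous variables (the numbers below are illustrative only):

```python
# Check of (2), (3), (5) on a small dependent joint pmf {(x, y): prob}
# (numbers are made up for illustration).
joint = {(0, 1): 0.25, (1, 0): 0.25, (1, 2): 0.25, (2, 2): 0.25}

def E(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(p * g(x, y) for (x, y), p in joint.items())

cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)  # (3)
var = lambda g: E(lambda x, y: g(x, y) ** 2) - E(g) ** 2

# (2) Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
lhs = var(lambda x, y: x + y)
rhs = var(lambda x, y: x) + var(lambda x, y: y) + 2 * cov
print(abs(lhs - rhs) < 1e-12)  # True

# (5) Cov(aX + b, cY + d) = a*c*Cov(X, Y), with illustrative a, b, c, d
a, b, c, d = 2, 1, -3, 5
cov_lin = (E(lambda x, y: (a * x + b) * (c * y + d))
           - E(lambda x, y: a * x + b) * E(lambda x, y: c * y + d))
print(abs(cov_lin - a * c * cov) < 1e-12)  # True
```

For (4), the reverse is false: uncorrelated does not imply independent (e.g., a symmetric X paired with Y = X²).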