3. Let X and Y be two independent and identically distributed random variables with expected value 1 and variance 2.56. First, find a non-trivial upper bound for P(|X + Y − 2| ≥ 1). Now suppose that X and Y are independent and identically distributed N(1, 2.56) random variables. What is P(|X + Y − 2| ≥ 1) exactly? Why is the upper bound obtained first so different from the exact probability?
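As an illustrative numeric companion (a sketch, not part of the original problem): Chebyshev's inequality applied to S = X + Y, versus the exact normal probability. Note that with the stated variance, the Chebyshev bound actually exceeds 1 here.

```python
import math

# With E[X] = E[Y] = 1 and Var(X) = Var(Y) = 2.56, the sum S = X + Y has
# E[S] = 2 and Var(S) = 5.12, so Chebyshev gives
#     P(|S - 2| >= 1) <= Var(S) / 1**2 = 5.12,
# which exceeds 1 and is therefore capped at the trivial bound 1.
var_s = 2 * 2.56
cheb_bound = min(1.0, var_s / 1**2)

# Exact value when X, Y ~ N(1, 2.56): then S ~ N(2, 5.12) and
#     P(|S - 2| >= 1) = 2 * (1 - Phi(1 / sqrt(5.12))).
def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

exact = 2.0 * (1.0 - phi(1.0 / math.sqrt(var_s)))

print(cheb_bound)         # 1.0
print(round(exact, 4))    # ≈ 0.6585
```

The gap between the two numbers is the usual story: Chebyshev uses only the mean and variance and must hold for every such distribution, while the normal computation uses the full shape of the distribution.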
9. Let X and Y be independent and identically distributed random variables with mean µ and variance σ². Find the following: (a) E[(X + 2)] (b) Var(3X + 4) (c) E[(X − Y)²] (d) Cov(X + Y, X − Y)
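An exact enumeration check (illustrative; the discrete distribution below is arbitrary, and part (c) is read as asking for E[(X − Y)²]): for iid X, Y, E[(X − Y)²] = 2σ² and Cov(X + Y, X − Y) = 0.

```python
import itertools

# Arbitrary small discrete distribution for X (and independently Y).
support = [0.0, 1.0, 3.0]
probs = [0.2, 0.5, 0.3]

mu = sum(p * x for p, x in zip(probs, support))
sigma2 = sum(p * (x - mu) ** 2 for p, x in zip(probs, support))

e_diff_sq = 0.0   # E[(X - Y)^2]
e_prod = 0.0      # E[(X + Y)(X - Y)] = E[X^2 - Y^2]
for (px, x), (py, y) in itertools.product(zip(probs, support), repeat=2):
    w = px * py                      # joint probability under independence
    e_diff_sq += w * (x - y) ** 2
    e_prod += w * (x + y) * (x - y)

# Cov(X + Y, X - Y) = E[(X+Y)(X-Y)] - E[X+Y] * E[X-Y], and E[X - Y] = 0 for iid.
cov = e_prod - (2 * mu) * 0.0
print(abs(e_diff_sq - 2 * sigma2) < 1e-12)   # True
print(abs(cov) < 1e-12)                      # True
```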
Suppose that X and Y are independent, identically distributed, geometric random variables with parameter p. Show that P(X = i | X + Y = n) = 1/(n − 1), for i = 1, 2, ..., n − 1.
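The key step can be checked exactly in a few lines (the values of p and n below are arbitrary): the joint probability P(X = i, X + Y = n) = p(1−p)^(i−1) · p(1−p)^(n−i−1) = p²(1−p)^(n−2) does not depend on i, so the conditional distribution is uniform on {1, ..., n−1}.

```python
p, n = 0.3, 6   # arbitrary parameter and target sum for the check

def geom_pmf(k):
    """P(X = k) for X ~ Geometric(p), support k = 1, 2, ..."""
    return p * (1.0 - p) ** (k - 1)

# Joint probabilities P(X = i, Y = n - i) for i = 1, ..., n-1.
joint = [geom_pmf(i) * geom_pmf(n - i) for i in range(1, n)]
total = sum(joint)                    # = P(X + Y = n)
cond = [j / total for j in joint]     # conditional pmf of X given X + Y = n
print(cond)  # every entry ≈ 1/(n - 1) = 0.2
```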
(1 point) If X and Y are independent and identically distributed uniform random variables on (0, 1), compute each of the following joint densities. (a) U = 3X, V = 3X/Y: f_{U,V}(u, v) = (b) U = 5X + Y, V = 3X/(X + Y): f_{U,V}(u, v) =
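A Monte Carlo sanity check for part (a), assuming the transformation reads U = 3X, V = 3X/Y (the source is garbled, so this reading is an assumption). Inverting gives X = U/3, Y = U/V, with Jacobian determinant −U/(3V²), so the change-of-variables formula gives f_{U,V}(u, v) = u/(3v²) on 0 < u < 3, v > u.

```python
import random

# Estimate the joint density of (U, V) = (3X, 3X/Y) near a test point and
# compare with the change-of-variables prediction u / (3 v^2).
# At (u0, v0) = (1.5, 2.0) the formula predicts 1.5 / (3 * 2.0**2) = 0.125.
random.seed(0)
N, h = 400_000, 0.05
u0, v0 = 1.5, 2.0
hits = 0
for _ in range(N):
    x = random.random()
    y = 1.0 - random.random()   # uniform on (0, 1], avoids division by zero
    u, v = 3.0 * x, 3.0 * x / y
    if abs(u - u0) <= h and abs(v - v0) <= h:
        hits += 1
est = hits / (N * (2 * h) ** 2)  # hits per unit area ≈ joint density there
print(est)  # should land near 0.125
```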
(3) Consider a sequence X1, X2, ..., Xn of independent and identically distributed random variables such that Xk ≥ 0, with common mean E[Xk] = 1. Define Sn = Σ_{k=1}^n Xk. (a) Compute E[...]. (b) Show that ...
15. Let X1, X2, ... be independent, identically distributed random variables with E|X1| < ∞, and denote Sn = X1 + ... + Xn. Prove that ... [Use symmetry in the final step.]
3. Let {X1, X2, X3, X4} be independent, identically distributed random variables with p.d.f. f(x) = 2x if 0 < x < 1 and 0 otherwise. Find E[Y] where Y = min{X1, X2, X3, X4}.
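A numeric check, assuming the density is f(x) = 2x on (0, 1) as reconstructed: then F(x) = x², so P(Y > y) = P(min > y) = (1 − y²)⁴ and E[Y] = ∫₀¹ (1 − y²)⁴ dy = 128/315.

```python
# Midpoint-rule evaluation of E[Y] = integral_0^1 (1 - y^2)^4 dy,
# using the tail formula E[Y] = integral of P(Y > y) for nonnegative Y.
n = 200_000
dy = 1.0 / n
est = 0.0
for k in range(n):
    y = (k + 0.5) * dy            # midpoint of the k-th subinterval
    est += (1.0 - y * y) ** 4 * dy
print(round(est, 6))  # ≈ 0.406349, i.e. 128/315
```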
Let X and Y be independent identically distributed random variables with means µx and µy respectively. Prove the following. a. E[aX + bY] = aµx + bµy for any constants a and b. b. Var[X] = E[X²] − (E[X])². c. Var[aX] = a²Var[X] for any constant a. d. Assume for this part only that X and Y are not independent. Then Var[X + Y] = Var[X] + Var[Y] + 2(E[XY] − E[X]E[Y]). e. ...
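Part d can be verified by exact enumeration on a small joint distribution (the joint pmf below is arbitrary and deliberately not a product measure, so X and Y are dependent):

```python
# Check Var[X + Y] = Var[X] + Var[Y] + 2*(E[XY] - E[X]E[Y]) on a
# dependent discrete pair, by exact enumeration of the joint pmf.
joint = {
    (0, 0): 0.3, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.5,
}

def E(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(p * g(x, y) for (x, y), p in joint.items())

ex, ey = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: x * x) - ex ** 2
var_y = E(lambda x, y: y * y) - ey ** 2
cov = E(lambda x, y: x * y) - ex * ey              # E[XY] - E[X]E[Y]
var_sum = E(lambda x, y: (x + y) ** 2) - E(lambda x, y: x + y) ** 2
print(abs(var_sum - (var_x + var_y + 2 * cov)) < 1e-12)  # True
```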
Problem 42.5 Let X and Y be two independent and identically distributed random variables with common density function f(x) = 2x for 0 < x < 1 and 0 otherwise. Find the probability density function of X + Y. 42.5 If 0 < a < 1 then f_{X+Y}(a) = (2/3)a³; if 1 < a < 2 then f_{X+Y}(a) = −(2/3)a³ + 4a − 8/3; and f_{X+Y}(a) = 0 otherwise.
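The piecewise answer can be checked numerically: for f(x) = 2x on (0, 1), the convolution is f_{X+Y}(a) = ∫ 2x · 2(a − x) dx over max(0, a−1) < x < min(1, a), and on 1 < a < 2 the closed form −(2/3)a³ + 4a − 8/3 should match. (The test point a = 1.5 below is arbitrary.)

```python
# Midpoint-rule evaluation of the convolution integral at a = 1.5,
# compared against the closed-form piece for 1 < a < 2.
a = 1.5
lo, hi = max(0.0, a - 1.0), min(1.0, a)   # support of the integrand
n = 200_000
dx = (hi - lo) / n
conv = 0.0
for k in range(n):
    x = lo + (k + 0.5) * dx
    conv += 4.0 * x * (a - x) * dx        # 2x * 2(a - x)
closed = -(2.0 / 3.0) * a ** 3 + 4.0 * a - 8.0 / 3.0
print(round(conv, 4), round(closed, 4))   # both ≈ 1.0833 (= 13/12)
```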