TOPIC: Properties of expectation, variance, and covariance.
Problem 2. Suppose two continuous random variables (X, Y) ~ f(x, y). (1) Prove that E(X + Y) = E(X) + E(Y). ...
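Before proving linearity of expectation, a quick Monte Carlo sketch can build intuition: the identity holds even when X and Y are dependent. The dependent pair below is an illustrative choice, not part of the problem.

```python
import random

random.seed(0)
# Illustrative sketch (not a proof): draw a deliberately dependent pair
# (X, Y) and check that the sample mean of X + Y matches the sum of the
# sample means, and is near the theoretical E(X) + E(Y) = 0 + 0.5.
n = 200_000
xs, ys = [], []
for _ in range(n):
    x = random.gauss(0, 1)
    y = 2 * x + random.uniform(0, 1)   # Y depends on X on purpose
    xs.append(x)
    ys.append(y)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
mean_sum = sum(x + y for x, y in zip(xs, ys)) / n
```

Note that no independence assumption is used anywhere; that is exactly the point of the identity.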
2. Suppose X and Y are independent continuous random variables. Show that P(Y < X) = ∫_{-∞}^{∞} F_Y(x) f_X(x) dx, where F_Y is the CDF of Y and f_X is the PDF of X. [Hint: P(Y ∈ A) = ∫ P(Y ∈ A | X = x) f_X(x) dx.] Rewrite the above equation as an expectation of a function of X, i.e., P(Y < X) = E_X[·]. Use the above relation to compute P(Y < X) if X ~ Exp(2)...
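The identity P(Y < X) = E_X[F_Y(X)] can be checked numerically. The problem statement is truncated after "X ~ Exp(2)", so the choice Y ~ Exp(1) below is an assumption made purely for illustration.

```python
import math
import random

random.seed(1)
# Sketch: compare E_X[F_Y(X)] with a direct simulation of P(Y < X).
# Rates: X ~ Exp(2) from the problem; Y ~ Exp(1) is an assumed example.
rate_x, rate_y = 2.0, 1.0
n = 200_000
xs = [random.expovariate(rate_x) for _ in range(n)]

# E_X[F_Y(X)] with F_Y(x) = 1 - exp(-rate_y * x)
expectation_form = sum(1 - math.exp(-rate_y * x) for x in xs) / n

# Direct Monte Carlo estimate of P(Y < X)
direct = sum(random.expovariate(rate_y) < x for x in xs) / n
```

With these rates the closed form works out to rate_y / (rate_x + rate_y) = 1/3, and both estimates should land close to it.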
2. Let X and Y be two random variables with a joint distribution (discrete or continuous). Prove that Cov(X, Y) = E(XY) - E(X)E(Y). (15 points)
3. Explain in detail how we can derive the formula Var(X) = E(X²) - [E(X)]² from the formula in Problem 2 above. (Please do not use any other method of proof.) (10 points)
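The shortcut formula in Problem 2 can be sanity-checked on simulated data: the defining form E[(X - EX)(Y - EY)] and the shortcut E(XY) - E(X)E(Y) agree sample by sample. The correlated pair below is an illustrative choice.

```python
import random

random.seed(2)
# Sketch: estimate Cov(X, Y) two ways and confirm they coincide.
# True value here is Cov(U, U + noise) = Var(U(0,1)) = 1/12.
n = 100_000
data = []
for _ in range(n):
    x = random.uniform(0, 1)
    y = x + random.gauss(0, 0.5)       # correlated pair for illustration
    data.append((x, y))
ex = sum(x for x, _ in data) / n
ey = sum(y for _, y in data) / n
cov_def = sum((x - ex) * (y - ey) for x, y in data) / n
cov_short = sum(x * y for x, y in data) / n - ex * ey
```

The two estimators are algebraically identical, which mirrors the expansion used in the proof.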
4. Recall that the covariance of random variables X and Y is defined by Cov(X, Y) = E[(X - EX)(Y - EY)].
(a) (2 pt) TRUE or FALSE (circle one): E(XY) = 0 implies Cov(X, Y) = 0.
(b) (4 pt) a, b, c, d are constants. Mark each correct statement:
( ) Cov(aX, cY) = ac Cov(X, Y)
( ) Cov(aX + b, cY + d) = ac Cov(X, Y) + bc Cov(X, Y) + da Cov(X, Y) + bd
( ) ...
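The candidate statements in part (b) can be probed numerically; the check below shows that constant shifts drop out of the covariance, so Cov(aX + b, cY + d) equals ac Cov(X, Y) with no extra terms. The constants and the joint distribution are arbitrary illustrative choices.

```python
import random

random.seed(3)
# Sketch: compare Cov(aX + b, cY + d) against ac * Cov(X, Y).
a, b, c, d = 2.0, 5.0, -3.0, 7.0

def cov(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / n

n = 50_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x + random.gauss(0, 1) for x in xs]   # correlated with Cov(X, Y) = 1
lhs = cov([a * x + b for x in xs], [c * y + d for y in ys])
rhs = a * c * cov(xs, ys)
```

Here rhs is close to a * c * 1 = -6, and lhs matches it exactly up to rounding.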
X and Y are random variables.
(a) Show that E(X) = E(E(X | Y)).
(b) If P({X ≤ x, Y ≤ y}) = P({X ≤ x}) P({Y ≤ y}) for all x and y, then show that E(XY) = E(X)E(Y), i.e., if two random variables are independent, then show that they are uncorrelated. Is the converse true? Prove or disprove.
(c) The moment generating function of a random variable Z is defined as Ψ_Z(t) = E(e^{tZ}). Now if X and Y are independent random variables, then show that Ψ_{X+Y}(t) = Ψ_X(t) Ψ_Y(t). Also, if Ψ_X(t) = (λ - ...
(d) Show the conditional...
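Part (a), the law of total expectation, can be illustrated with a hierarchy where the inner conditional expectation is known in closed form. The particular hierarchy below is an assumed example, not taken from the problem.

```python
import random

random.seed(4)
# Sketch of (a): Y ~ Uniform(0.01, 1) and X | Y ~ Exp(rate 1/Y), so
# E(X | Y) = Y; then E(E(X | Y)) = E(Y) should match E(X).
n = 200_000
total_x = 0.0
total_cond = 0.0
for _ in range(n):
    y = random.uniform(0.01, 1.0)      # kept away from 0 for stability
    x = random.expovariate(1 / y)      # mean of Exp(rate 1/y) is y
    total_x += x
    total_cond += y                    # E(X | Y = y) = y in closed form
mean_x = total_x / n
mean_cond = total_cond / n
```

Both averages estimate the same number, here E(Y) = (0.01 + 1)/2 = 0.505, which is the content of the tower property.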
Problem #5 (20 points) - Quotient of Two Random Variables. Suppose that X and Y are independent positive continuous random variables with pdfs f_X(x) and f_Y(y), and suppose that Z = X/Y. Show that the pdf of Z can be computed from the pdfs f_X(x) and f_Y(y) using f_Z(z) = ∫_0^∞ f_X(yz) f_Y(y) y dy.
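The quotient formula can be tested on a concrete pair. With X, Y ~ Exp(1) independent (an illustrative assumption), the formula gives f_Z(z) = ∫_0^∞ e^{-yz} e^{-y} y dy = 1/(1+z)², hence P(Z ≤ z) = z/(1+z), which a direct simulation of X/Y should reproduce.

```python
import random

random.seed(5)
# Sketch: Monte Carlo estimate of P(X/Y <= 1) for X, Y ~ Exp(1),
# compared with the closed form z/(1+z) obtained from the quotient pdf.
n = 200_000
z0 = 1.0
hits = 0
for _ in range(n):
    x = random.expovariate(1.0)
    y = random.expovariate(1.0)
    if x / y <= z0:
        hits += 1
p_sim = hits / n
p_formula = z0 / (1 + z0)              # = 0.5 at z0 = 1
```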
a. Suppose X and Y are continuous random variables with joint density f(x, y). Prove that the density of X + Y is given by: f_{X+Y}(t) = ∫_{-∞}^{∞} f(u, t - u) du.
b. Use part (a) to show that if X, Y are independent and standard Gaussian (i.e., N(0,1)), then X + Y is centered Gaussian with variance 2, that is, N(0, 2).
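The claim in part (b) can be sanity-checked by simulation: the sum of two independent standard normals should have sample mean near 0 and sample variance near 2.

```python
import random

random.seed(6)
# Sketch: simulate X + Y for independent N(0,1) draws and check the
# first two moments against the claimed N(0, 2) distribution.
n = 200_000
sums = [random.gauss(0, 1) + random.gauss(0, 1) for _ in range(n)]
mean_s = sum(sums) / n
var_s = sum((s - mean_s) ** 2 for s in sums) / n
```

This checks only the moments, of course; the convolution argument in part (a) is what pins down the full Gaussian shape.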
2. Suppose X and Y are continuous random variables with joint density function f(x, y) = x² y e^{-xy} for 1 < x < 2 and 0 < y < ∞, and f(x, y) = 0 otherwise.
a. Calculate the (marginal) densities of X and Y.
b. Calculate E[X] and E[Y].
c. Calculate Cov(X, Y).
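Before computing marginals, it is worth confirming the density is properly normalized. Reading the (partly garbled) density as f(x, y) = x² y e^{-xy} on 1 < x < 2, 0 < y < ∞, the numerical integration below checks that it integrates to 1.

```python
import math

# Sanity-check sketch (not a solution): midpoint-rule integration of the
# assumed density f(x, y) = x^2 * y * exp(-x*y) over its support.
def f(x, y):
    return x * x * y * math.exp(-x * y)

nx, ny, y_max = 400, 2000, 40.0   # truncate y at 40; the tail is negligible
dx, dy = 1.0 / nx, y_max / ny
total = 0.0
for i in range(nx):
    x = 1.0 + (i + 0.5) * dx
    for j in range(ny):
        y = (j + 0.5) * dy
        total += f(x, y) * dx * dy
```

The same double-integration setup, with an extra factor of x, y, or xy in the integrand, can be reused to spot-check the answers to parts b and c.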
Let X and Y be independent normal random variables with parameters E[X] = μ_X, E[Y] = μ_Y and Var(X) = σ_X², Var(Y) = σ_Y². Indicate whether each of the following statements is true or false. Notation: f_{X,Y}(x, y), f_X(x), f_Y(y) denote the joint and marginal PDFs of X and Y, respectively; Φ(x) is the CDF of a standard normal random variable with zero mean and unit variance.
E[XY] = 0
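The first statement can be probed with concrete parameters: for independent variables E[XY] = E[X]E[Y] = μ_X μ_Y, which is 0 only when one of the means vanishes. The parameter values below are illustrative assumptions.

```python
import random

random.seed(7)
# Sketch: estimate E[XY] for independent normals with nonzero means and
# compare against mu_x * mu_y (here -0.75, not 0).
mu_x, sd_x = 1.5, 2.0
mu_y, sd_y = -0.5, 1.0
n = 200_000
acc = 0.0
for _ in range(n):
    acc += random.gauss(mu_x, sd_x) * random.gauss(mu_y, sd_y)
mean_xy = acc / n
```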
Suppose two continuous random variables X and Y have cumulative distribution functions F_X(x) and F_Y(y), respectively. Suppose that F_X(x) > F_Y(x) for all x. Indicate whether the following statements are TRUE or FALSE, with a brief explanation.
(a) E(X) > E(Y)
(b) The probability density functions f_X, f_Y satisfy f_X(x) > f_Y(x) for all x.
(c) P(X = 1) > P(Y = 1)
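Statement (b) can be probed with a concrete pair of distributions. The choice X ~ N(0, 1), Y ~ N(1, 1) below is an illustrative assumption: then F_X(x) = Φ(x) > Φ(x - 1) = F_Y(x) for every x, yet the two density curves cross, so a strict CDF ordering need not carry over to the densities.

```python
import math

# Sketch: X ~ N(0,1), Y ~ N(1,1). The CDF gap stays positive everywhere,
# but the pdf gap changes sign (the densities cross at x = 1/2).
def normal_pdf(x, mu):
    return math.exp(-(x - mu) ** 2 / 2) / math.sqrt(2 * math.pi)

def normal_cdf(x, mu):
    return 0.5 * (1 + math.erf((x - mu) / math.sqrt(2)))

cdf_gap_left = normal_cdf(-1.0, 0) - normal_cdf(-1.0, 1)   # positive
cdf_gap_right = normal_cdf(2.0, 0) - normal_cdf(2.0, 1)    # positive
pdf_gap_at_0 = normal_pdf(0.0, 0) - normal_pdf(0.0, 1)     # f_X > f_Y here
pdf_gap_at_1 = normal_pdf(1.0, 0) - normal_pdf(1.0, 1)     # f_X < f_Y here
```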
2. Continuous random variables X and Y have joint pdf f(x, y) = c x y for 0 ≤ x ≤ 1, 0 ≤ y ≤ 2, and f(x, y) = 0 otherwise.
a) Find c.
b) Find P(X + Y ≤ 1).
c) Find f_X(x) and f_Y(y).
d) Are X and Y independent? Justify your answer.
e) Find Cov(X, Y) and Corr(X, Y).
f) Find f_{X|Y}(x|y) and f_{Y|X}(y|x).
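An analytic answer to part a) can be cross-checked numerically: c must satisfy c · I = 1, where I is the double integral of xy over the rectangle. The midpoint-rule sketch below evaluates I and recovers c.

```python
# Sanity-check sketch for part a): evaluate I = integral of x*y over
# 0 <= x <= 1, 0 <= y <= 2 by the midpoint rule, then set c = 1 / I.
nx, ny = 1000, 1000
dx, dy = 1.0 / nx, 2.0 / ny
integral = 0.0
for i in range(nx):
    x = (i + 0.5) * dx
    for j in range(ny):
        y = (j + 0.5) * dy
        integral += x * y * dx * dy
c = 1.0 / integral
```

The same grid, with the integrand restricted to the region x + y ≤ 1 or multiplied by extra factors of x and y, can be reused to spot-check parts b) and e).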