True or False, explain your answers
(d) (5pts) For two random variables X and Y, suppose E[XY] = 6 and E[X] = −2. Then we have E[Y] = −3 if X and Y are negatively correlated. (T, F)
Answer: the statement is false.
Explanation: if E[Y] = −3, then
Cov(X, Y) = E[XY] − E[X]E[Y]
= 6 − (−2)(−3)
= 6 − 6
= 0.
Zero covariance means X and Y are uncorrelated, so they cannot be negatively correlated.
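The arithmetic above can be restated as a one-line check (a trivial sketch, using only the moments given in the problem):

```python
# If E[XY] = 6, E[X] = -2 and (hypothetically) E[Y] = -3, then
# Cov(X, Y) = E[XY] - E[X]*E[Y] = 0: X and Y would be uncorrelated,
# contradicting the claim that they are negatively correlated.
e_xy, e_x, e_y = 6, -2, -3
cov = e_xy - e_x * e_y
assert cov == 0
```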
Problem 2. Suppose two continuous random variables (X, Y) ~ f(x, y).
(1) Prove E(X + Y) = E(X) + E(Y).
(2) Prove Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y).
(3) Prove Cov(X, Y) = E(XY) − E(X)E(Y).
(4) Prove that if X and Y are independent, i.e., f(x, y) = fX(x)fY(y) for any (x, y), then Cov(X, Y) = 0. Is the reverse true?
(5) Prove Cov(aX + b, cY + d) = ac Cov(X, Y).
(6) Prove Cov(X, X) = Var(X).
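Although the parts above ask for proofs, identities (2), (3), (5) and (6) also hold exactly for the empirical moments of any finite sample, which gives a quick numerical sanity check (a sketch, not a proof; the simulated data below is purely illustrative):

```python
# Numerical sanity check (not a proof): identities (2), (3), (5), (6)
# hold exactly for empirical moments of any finite sample.
import random

random.seed(0)
n = 10_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [2 * x + random.gauss(0, 1) for x in xs]  # deliberately correlated

def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    # (3): Cov(U, V) = E(UV) - E(U)E(V), applied to sample moments
    return mean([a * b for a, b in zip(u, v)]) - mean(u) * mean(v)

def var(v):
    return cov(v, v)  # (6): Var(X) = Cov(X, X)

# (2): Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
assert abs(var([x + y for x, y in zip(xs, ys)])
           - (var(xs) + var(ys) + 2 * cov(xs, ys))) < 1e-9

# (5): Cov(aX + b, cY + d) = ac Cov(X, Y)
a, b, c, d = 3.0, -1.0, 0.5, 4.0
lhs = cov([a * x + b for x in xs], [c * y + d for y in ys])
assert abs(lhs - a * c * cov(xs, ys)) < 1e-9
```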
True or False
(a) If X ∩ Y = ∅, then the two events X and Y are independent?
(b) If event X is independent of event Y, then X^c is independent of Y?
(c) For a discrete random variable X, we have lim_{x→∞} p_X(x) = 0?
(d) For a continuous random variable X, we have lim_{x→∞} f_X(x) = 0?
(e) For a continuous random variable X, we have lim_{x→0} f_X(x) ≤ 1?
(f) For two discrete random variables X...
5. Suppose we have two random variables X and Y. They are discrete, have the exact same distribution, and are also independent. (You see below the distribution of X, which of course is also the distribution of Y; that is what we call independent and identically distributed.) [Distribution table of X missing from the source.]
(a) Find and draw the cumulative distribution function F(x) of X. Remember that F(x) = P(X ≤ x).
HINT: For the next 3 parts you might want to make...
4. Suppose X and Y are independent random variables with the same probability distribution, given by the cumulative distribution function
F(t) = 0 if t < 1, and F(t) = 1 − t^(−3) if t ≥ 1.
(a) (10 points) Compute E(X).
(b) (10 points) Compute E(XY).
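As a sanity check on parts (a) and (b) (a sketch, not the requested derivation): the density is f(t) = 3t^(−4) for t ≥ 1, so E(X) = 3/2, and by independence E(XY) = E(X)E(Y) = 9/4. Inverse-transform sampling confirms this numerically:

```python
# Monte Carlo check for the CDF F(t) = 1 - t**-3 (t >= 1): sample by
# inverse transform, t = (1 - u)**(-1/3) for u ~ Uniform(0, 1).
import random

random.seed(0)
n = 200_000

def sample():
    # 1 - random.random() lies in (0, 1], so the power is always defined
    return (1 - random.random()) ** (-1 / 3)

xs = [sample() for _ in range(n)]
ys = [sample() for _ in range(n)]  # X and Y are i.i.d.

e_x = sum(xs) / n                              # exact answer: 3/2
e_xy = sum(x * y for x, y in zip(xs, ys)) / n  # exact answer: 9/4
```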
5. If two random variables X and Y are independent, are they also uncorrelated? Separately, if X and Y are uncorrelated, are they also independent? When is the second statement always true?
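A standard counterexample relevant to the second question (my illustration, not part of the original problem): take X uniform on {−1, 0, 1} and Y = X². Then Cov(X, Y) = 0, yet Y is a deterministic function of X:

```python
# Classic counterexample: X uniform on {-1, 0, 1}, Y = X**2.
# X and Y are uncorrelated (Cov = 0) yet clearly dependent.
from fractions import Fraction

support = [-1, 0, 1]
p = Fraction(1, 3)  # P(X = x) for each point of the support

e_x = sum(p * x for x in support)        # E[X]  = 0
e_y = sum(p * x**2 for x in support)     # E[Y]  = E[X^2] = 2/3
e_xy = sum(p * x**3 for x in support)    # E[XY] = E[X^3] = 0

cov = e_xy - e_x * e_y
assert cov == 0  # uncorrelated

# ...but dependent: P(X = 0, Y = 0) = 1/3, while P(X = 0)*P(Y = 0) = 1/9.
p_joint = Fraction(1, 3)
assert p_joint != p * Fraction(1, 3)
```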
Suppose X and Y are jointly continuous random variables with probability density function f(x, y) = (1/6)(x + y) for 0 < x < 1, 0 < y < 3, and f(x, y) = 0 otherwise.
(a) Find E[XY].
(b) Are X and Y independent? Justify your answer citing an appropriate theorem.
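As a numerical cross-check on part (a) (a sketch; the exact double integral works out to E[XY] = 1), a midpoint Riemann sum over the support:

```python
# Midpoint-rule check of E[XY] for the density f(x, y) = (x + y)/6
# on 0 < x < 1, 0 < y < 3.  Exact value: E[XY] = 1.
nx, ny = 400, 1200
dx, dy = 1 / nx, 3 / ny

total = 0.0  # accumulates E[XY]
mass = 0.0   # should integrate to 1 if the density is valid
for i in range(nx):
    x = (i + 0.5) * dx
    for j in range(ny):
        y = (j + 0.5) * dy
        f = (x + y) / 6
        mass += f * dx * dy
        total += x * y * f * dx * dy
```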
2. Suppose X and Y are continuous random variables with joint density function f(x, y) = x²y e^(−xy) for 1 < x < 2 and 0 < y < ∞, and zero otherwise.
a. Calculate the (marginal) densities of X and Y.
b. Calculate E[X] and E[Y].
c. Calculate Cov(X, Y).
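Reading the density as f(x, y) = x²y e^(−xy) (a reading that integrates to 1 over the stated support), X is marginally Uniform(1, 2) and, given X = x, Y follows a Gamma distribution with shape 2 and rate x. That gives a direct Monte Carlo sketch of parts (b) and (c):

```python
# Monte Carlo check for f(x, y) = x**2 * y * exp(-x*y), 1 < x < 2, y > 0.
# Marginally X ~ Uniform(1, 2); conditionally Y | X=x ~ Gamma(shape 2, rate x).
import math
import random

random.seed(0)
n = 200_000
xs, ys = [], []
for _ in range(n):
    x = random.uniform(1, 2)
    xs.append(x)
    ys.append(random.gammavariate(2, 1 / x))  # scale 1/x, i.e. rate x

e_x = sum(xs) / n
e_y = sum(ys) / n
e_xy = sum(a * b for a, b in zip(xs, ys)) / n
cov = e_xy - e_x * e_y

exact_e_y = 2 * math.log(2)      # E[Y] = integral of 2/x over (1, 2)
exact_cov = 2 - 3 * math.log(2)  # E[XY] = 2 exactly, E[X] = 3/2
```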
True or False, with explanation please.
1. True or false:
a. The expectation of a random variable uniformly distributed over (a, b) is equal to (b − a).
b. If the random variable X is applied to the input of a half-wave rectifier, so that the output is characterized as Y = g(X) with g(x) = x for x > 0 and g(x) = 0 for x ≤ 0, then ...
c. If a and b are constants and X is a random variable and Y = aX + b, then f_Y(y) ...
d. If a and b are constants...
1. Let X and Y be two random variables. Then Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y). True / False
2. Let c be a constant. Then Var(c) = c². True / False
3. Knowing that a university has the following units/campuses: A, B, and the medical school in another city. You are interested to know on average how many hours per week the university students spend doing homework. You go to campus A and randomly survey students walking to classes for one day. Then, this is a random sample representing the entire...
1. Suppose X and Y are continuous random variables with joint pdf f(x, y) = 4(x − xy) if 0 < x < 1 and 0 < y < 1, and zero otherwise.
(a) Find E(XY).
(b) Find E(X − Y).
(c) Find Var(X − Y).
(d) What is E(Y)?
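Assuming the garbled density reads f(x, y) = 4(x − xy), which integrates to 1 over the unit square, a midpoint-rule sketch of parts (a)–(d) (exact values: E(XY) = 2/9, E(X − Y) = 1/3, Var(X − Y) = 1/9, E(Y) = 1/3):

```python
# Midpoint-rule check for the (reconstructed) density
# f(x, y) = 4*(x - x*y) on the unit square.
n = 600
h = 1 / n

mass = e_xy = e_x = e_y = e_x2 = e_y2 = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        w = 4 * (x - x * y) * h * h  # f(x, y) dA
        mass += w
        e_x += x * w
        e_y += y * w
        e_xy += x * y * w
        e_x2 += x * x * w
        e_y2 += y * y * w

# Var(X - Y) = Var(X) + Var(Y) - 2 Cov(X, Y)
var_xmy = (e_x2 - e_x**2) + (e_y2 - e_y**2) - 2 * (e_xy - e_x * e_y)
```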