Solution:
If X and Y are independent random variables, then the covariance between X and Y is zero. Covariance is defined as
Cov(X, Y) = E(XY) - E(X)E(Y)
Setting Cov(X, Y) = 0 gives
E(XY) - E(X)E(Y) = 0
E(XY) = E(X)E(Y), and this condition holds whenever X and Y are independent.
So the statement is true: E(XY) = E(X)E(Y) if X and Y are independent random variables.
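As a quick sanity check, the identity E(XY) = E(X)E(Y) can be verified empirically by simulation. The sketch below draws two independent samples (the choice of a normal X and a uniform Y is arbitrary, purely for illustration) and compares the sample estimate of E[XY] against E[X]·E[Y]:

```python
# Empirical check that E[XY] = E[X]E[Y] when X and Y are independent.
# The specific distributions are arbitrary illustrative choices.
import random

random.seed(0)
n = 200_000
xs = [random.gauss(2.0, 1.0) for _ in range(n)]    # X ~ Normal(2, 1)
ys = [random.uniform(0.0, 3.0) for _ in range(n)]  # Y ~ Uniform(0, 3), drawn independently of X

e_x = sum(xs) / n
e_y = sum(ys) / n
e_xy = sum(x * y for x, y in zip(xs, ys)) / n
cov = e_xy - e_x * e_y  # sample covariance; should be close to 0

print(f"E[XY]     = {e_xy:.4f}")
print(f"E[X]E[Y]  = {e_x * e_y:.4f}")
print(f"Cov(X, Y) = {cov:.4f}")
```

With a large sample the two printed expectations agree to a few decimal places and the covariance is near zero, matching the derivation above.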
Please rate the answer.
Thank you.