Stuck on this problem; any help would be great, thank you. For any further clarification, please comment.

12. Let X and Y be two random variables and define I X Y T= SD(X...
Prove the following. Question 2: Let X and Y be any two random variables and let a and b be any two real numbers. Show that Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y).
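As a quick numerical sanity check of this identity (not a proof), one can evaluate both sides on a small made-up discrete joint pmf; all values below are arbitrary:

```python
# Check Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)
# on a hypothetical discrete joint pmf.
import numpy as np

xs = np.array([0.0, 1.0, 2.0])
ys = np.array([-1.0, 3.0])
p = np.array([[0.1, 0.2],
              [0.3, 0.1],
              [0.2, 0.1]])          # joint pmf, rows index x, cols index y

X, Y = np.meshgrid(xs, ys, indexing="ij")

def E(g):                            # expectation of g(X, Y) under the pmf
    return float((g * p).sum())

var_x = E(X**2) - E(X)**2
var_y = E(Y**2) - E(Y)**2
cov_xy = E(X * Y) - E(X) * E(Y)

a, b = 2.0, -3.0
lhs = E((a * X + b * Y)**2) - E(a * X + b * Y)**2
rhs = a**2 * var_x + b**2 * var_y + 2 * a * b * cov_xy
print(abs(lhs - rhs) < 1e-12)        # True
```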
Let X and Y be random variables. The conditional variance of Y given X, denoted Var(Y | X), is defined as Var(Y | X) = E[Y^2 | X] - (E[Y | X])^2. Show that Var(Y) = E[Var(Y | X)] + Var(E[Y | X]). (This equality you are showing is known as the Law of Total Variance.) Hint: From the Law of Total Expectation, you get Var(Y) = E[Y^2] - (E[Y])^2...
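The Law of Total Variance can likewise be spot-checked on a hypothetical discrete joint pmf by computing the conditional quantities by brute force:

```python
# Check Var(Y) = E[Var(Y|X)] + Var(E[Y|X]) on a hypothetical joint pmf p(x, y).
import numpy as np

xs = np.array([0.0, 1.0])
ys = np.array([1.0, 2.0, 5.0])
p = np.array([[0.10, 0.25, 0.15],
              [0.20, 0.05, 0.25]])   # joint pmf p(x, y)

px = p.sum(axis=1)                   # marginal of X
py = p.sum(axis=0)                   # marginal of Y
p_y_given_x = p / px[:, None]        # conditional pmf of Y given each x

ey_given_x = p_y_given_x @ ys                 # E[Y | X = x]
ey2_given_x = p_y_given_x @ ys**2             # E[Y^2 | X = x]
var_y_given_x = ey2_given_x - ey_given_x**2   # Var(Y | X = x)

lhs = py @ ys**2 - (py @ ys)**2                                  # Var(Y)
rhs = px @ var_y_given_x + (px @ ey_given_x**2 - (px @ ey_given_x)**2)
print(abs(lhs - rhs) < 1e-12)        # True
```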
5. Let X1, X2, ..., Xn be a random sample from a distribution with finite variance. Show that (i) Cov(Xi - X̄, X̄) = 0 for each i, and (ii) ρ(Xi - X̄, Xj - X̄) = -1/(n - 1) for i ≠ j, i, j = 1, ..., n. (Recall that ρ(X, Y) = Cov(X, Y) / sqrt(Var(X) Var(Y)) for any two random variables X and Y.)
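Both claims can be checked exactly, with no simulation: for iid variables with variance s2, the covariance of two linear forms aᵀX and bᵀX is s2·(a·b), so everything reduces to dot products of coefficient vectors. Sample size and variance below are arbitrary:

```python
# Exact check of (i) Cov(Xi - X̄, X̄) = 0 and (ii) ρ(Xi - X̄, Xj - X̄) = -1/(n-1)
# via coefficient vectors of linear forms in iid X_1, ..., X_n.
import numpy as np

n, s2 = 5, 2.7                      # hypothetical sample size and variance
xbar = np.full(n, 1.0 / n)          # coefficient vector of X-bar

def cov(a, b):                      # Cov(aᵀX, bᵀX) for iid components
    return s2 * float(a @ b)

i, j = 0, 3
di = np.eye(n)[i] - xbar            # coefficients of X_i - X-bar
dj = np.eye(n)[j] - xbar            # coefficients of X_j - X-bar

print(abs(cov(di, xbar)) < 1e-12)                          # (i): covariance is 0
rho = cov(di, dj) / np.sqrt(cov(di, di) * cov(dj, dj))
print(abs(rho - (-1.0 / (n - 1))) < 1e-12)                 # (ii): rho = -1/(n-1)
```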
1) Let X and Y be random variables. Show that Cov(X + Y, X - Y) = Var(X) - Var(Y) without appealing to the general formulas for the covariance of linear combinations of random variables; use the basic identity Cov(Z1, Z2) = E[Z1 Z2] - E[Z1]E[Z2], valid for any two random variables, and the properties of the expected value.

2) Let X be a normal random variable with zero mean and standard deviation σ. Let Φ(t) be the distribution function of the standard normal random variable...
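For part 1), a numerical spot check of the identity using the basic covariance formula Cov(Z1, Z2) = E[Z1 Z2] - E[Z1]E[Z2] on an arbitrary made-up joint pmf:

```python
# Check Cov(X + Y, X - Y) = Var(X) - Var(Y) on a hypothetical joint pmf.
import numpy as np

xs = np.array([-1.0, 0.0, 2.0])
ys = np.array([1.0, 4.0])
p = np.array([[0.15, 0.10],
              [0.25, 0.20],
              [0.10, 0.20]])        # joint pmf, sums to 1

X, Y = np.meshgrid(xs, ys, indexing="ij")
E = lambda g: float((g * p).sum())

lhs = E((X + Y) * (X - Y)) - E(X + Y) * E(X - Y)   # Cov via E[Z1 Z2] - E[Z1]E[Z2]
rhs = (E(X**2) - E(X)**2) - (E(Y**2) - E(Y)**2)    # Var(X) - Var(Y)
print(abs(lhs - rhs) < 1e-12)        # True
```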
Let X and Y be two independent random variables. Show that Cov (X, XY) = E(Y) Var(X).
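A sanity check of Cov(X, XY) = E(Y) Var(X): build an independent joint pmf as the outer product of two arbitrary marginals and compare both sides.

```python
# Check Cov(X, XY) = E(Y) Var(X) for independent X and Y (hypothetical marginals).
import numpy as np

xs, px = np.array([0.0, 1.0, 3.0]), np.array([0.2, 0.5, 0.3])
ys, py = np.array([-2.0, 1.0]),     np.array([0.4, 0.6])
p = np.outer(px, py)                 # independence: joint pmf is the product

X, Y = np.meshgrid(xs, ys, indexing="ij")
E = lambda g: float((g * p).sum())

lhs = E(X * (X * Y)) - E(X) * E(X * Y)     # Cov(X, XY)
rhs = E(Y) * (E(X**2) - E(X)**2)           # E(Y) Var(X)
print(abs(lhs - rhs) < 1e-12)        # True
```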
a) Let X and Y be two random variables with known joint PDF f_XY(x, y). Define two new random variables Z and W through the transformations W = ... Determine the joint pdf f_ZW(z, w) of the random variables Z and W in terms of the joint pdf f_XY(x, y). b) Assume that the random variables X and Y are jointly Gaussian, both are zero mean, both have the same variance σ^2, and additionally are statistically independent. Use this information to obtain the joint...
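Part a) rests on the standard change-of-variables (Jacobian) formula; since the specific transformations are cut off in the question, only the general form is sketched here:

```latex
f_{ZW}(z,w) \;=\; \sum_k \frac{f_{XY}\!\big(x_k(z,w),\, y_k(z,w)\big)}{\bigl|\det J_k\bigr|},
\qquad
J_k \;=\;
\begin{pmatrix}
\partial z/\partial x & \partial z/\partial y\\
\partial w/\partial x & \partial w/\partial y
\end{pmatrix}\Bigg|_{(x_k,\,y_k)}
```

where the sum runs over all solutions (x_k, y_k) of the system z = g1(x, y), w = g2(x, y) defining the transformation.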
Let X and Y be two independent random variables such that E(X) = E(Y) = μ but σ_X and σ_Y are unequal. We define another random variable Z as the weighted average of the random variables X and Y, as Z = θX + (1 - θ)Y, where θ is a scalar and 0 ≤ θ < 1. 1. Find the expected value of Z, E(Z), as a function of μ. 2. Find ... in terms of σ_Y and...
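A check of part 1, plus the variance of Z, assuming the intended part-2 answer is Var(Z) = θ^2 σ_X^2 + (1 - θ)^2 σ_Y^2 (which follows from independence); the distributions below are made up to have equal means and unequal variances:

```python
# Check E(Z) = mu and Var(Z) = th^2*varX + (1-th)^2*varY for Z = th*X + (1-th)*Y,
# with X, Y independent, equal means, unequal variances (hypothetical values).
import numpy as np

xs, px = np.array([1.0, 5.0]), np.array([0.5, 0.5])      # E(X) = 3, Var(X) = 4
ys, py = np.array([0.0, 6.0]), np.array([0.5, 0.5])      # E(Y) = 3, Var(Y) = 9
mu = float(px @ xs)                                      # common mean, 3.0
th = 0.25

p = np.outer(px, py)                                     # independence
X, Y = np.meshgrid(xs, ys, indexing="ij")
E = lambda g: float((g * p).sum())

Z = th * X + (1 - th) * Y
var_z = E(Z**2) - E(Z)**2
print(abs(E(Z) - mu) < 1e-12)                            # True: E(Z) = mu
print(abs(var_z - (th**2 * 4 + (1 - th)**2 * 9)) < 1e-12)  # True
```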
Problem D: Suppose X1, ..., X4 are independent random variables. Let Y be their sum, that is, Y = X1 + X2 + X3 + X4. Find/prove the mgf of Y and find E(Y), Var(Y), and P(8 ≤ Y ≤ 9) if a) X1, ..., X4 are Poisson random variables with means 5, 1, 4, and 2, respectively. b) [separately from part a)] X1, ..., X4 are Geometric random variables with p = 3/4.
9. Let X and Y be two random variables. Suppose that σ_X^2 = 4 and σ_Y^2 = 9. If we know that the two random variables Z = 2X - Y and W = X + Y are independent, find Cov(X, Y) and ρ(X, Y).

10. Let X and Y be bivariate normal random variables with parameters μ_X = 0, σ_X = 1, μ_Y = 1, σ_Y = 2, and ρ = -0.5. Find P(X + 2Y < 3). Find Cov(X - Y, X + 2Y).

11. Let X and Y...
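One way to sanity-check problem 10 numerically, using the stated parameters: X + 2Y is normal with mean μ_X + 2μ_Y and variance σ_X^2 + 4σ_Y^2 + 4 Cov(X, Y), where Cov(X, Y) = ρ σ_X σ_Y; the covariance in the second part expands by bilinearity.

```python
# Numerical answers for problem 10 under the parameters given in the question.
import math

mu_x, sx, mu_y, sy, rho = 0.0, 1.0, 1.0, 2.0, -0.5
cov_xy = rho * sx * sy                                   # -1

m = mu_x + 2 * mu_y                                      # mean of X + 2Y: 2
v = sx**2 + 4 * sy**2 + 4 * cov_xy                       # variance: 13
Phi = lambda t: 0.5 * (1 + math.erf(t / math.sqrt(2)))   # standard normal cdf
p = Phi((3 - m) / math.sqrt(v))
print(round(p, 3))                                       # 0.609

# Cov(X - Y, X + 2Y) = Var(X) + Cov(X, Y) - 2 Var(Y)
c = sx**2 + cov_xy - 2 * sy**2
print(c)                                                 # -8.0
```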
2. Let X and Y be two random variables with a joint distribution (discrete or continuous). Prove that Cov(X, Y) = E(XY) - E(X)E(Y). (15 points)

3. Explain in detail how we can derive the formula Var(X) = E(X^2) - [E(X)]^2 from the formula in Problem 2 above. (Please do not use any other method of proof.) (10 points)
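Both facts can be checked numerically on an arbitrary made-up joint pmf: the shortcut covariance formula against the definition, and Var(X) as the special case Cov(X, X):

```python
# Check Cov(X, Y) = E(XY) - E(X)E(Y), and Var(X) = E(X^2) - [E(X)]^2 as Cov(X, X).
import numpy as np

xs = np.array([1.0, 2.0, 4.0])
ys = np.array([0.0, 3.0])
p = np.array([[0.20, 0.10],
              [0.15, 0.25],
              [0.05, 0.25]])                    # joint pmf, sums to 1

X, Y = np.meshgrid(xs, ys, indexing="ij")
E = lambda g: float((g * p).sum())

cov_def = E((X - E(X)) * (Y - E(Y)))            # definition E[(X-EX)(Y-EY)]
cov_short = E(X * Y) - E(X) * E(Y)              # shortcut formula
print(abs(cov_def - cov_short) < 1e-12)         # True

var_def = E((X - E(X))**2)                      # Var(X) = Cov(X, X)
print(abs(var_def - (E(X**2) - E(X)**2)) < 1e-12)  # True
```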