Let X and Y be random variables with finite variance. The conditional variance of Y given X, denoted Var(Y | X), is defined as Var(Y | X) = E[Y^2 | X] − E[Y | X]^2. Show that Var(Y) = E[Var(Y | X)] + Var(E[Y | X]). (This equality is known as the Law of Total Variance.) Hint: from the Law of Total Expectation, Var(Y) = E[Y^2] − E[Y]^2 = E[E[Y^2 | X]] − E[E[Y | X]]^2.
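The identity can be checked by expanding each term on the right-hand side separately; a sketch of the derivation, in the notation of the problem:

```latex
\begin{aligned}
\mathbb{E}[\operatorname{Var}(Y\mid X)]
  &= \mathbb{E}\big[\mathbb{E}[Y^2\mid X] - \mathbb{E}[Y\mid X]^2\big]
   = \mathbb{E}[Y^2] - \mathbb{E}\big[\mathbb{E}[Y\mid X]^2\big],\\
\operatorname{Var}(\mathbb{E}[Y\mid X])
  &= \mathbb{E}\big[\mathbb{E}[Y\mid X]^2\big] - \big(\mathbb{E}[\mathbb{E}[Y\mid X]]\big)^2
   = \mathbb{E}\big[\mathbb{E}[Y\mid X]^2\big] - \mathbb{E}[Y]^2.
\end{aligned}
```

Adding the two lines, the E[E[Y | X]^2] terms cancel, leaving E[Y^2] − E[Y]^2 = Var(Y).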
5. Let X1, X2, ..., Xn be a random sample from a distribution with finite variance, and let X̄ denote the sample mean. Show that (i) Cov(Xi − X̄, X̄) = 0 for each i, and (ii) ρ(Xi − X̄, Xj − X̄) = −1/(n − 1) for i ≠ j, i, j = 1, ..., n. (Recall that ρ(X, Y) = Cov(X, Y)/√(Var(X) Var(Y)) for any two random variables X and Y.)
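Both parts can be sanity-checked exactly, without simulation: for an i.i.d. sample with unit variance the covariance matrix is the identity, and for linear combinations a·X and b·X we have Cov(a·X, b·X) = aᵀΣb. A minimal sketch (the choice n = 4 and unit variance are assumptions for illustration):

```python
import numpy as np

n = 4
Sigma = np.eye(n)            # covariance matrix of an i.i.d. sample with sigma^2 = 1

e = np.eye(n)                # e[i] = coefficient vector of X_i
xbar = np.full(n, 1.0 / n)   # coefficient vector of the sample mean

# (i) Cov(X_i - Xbar, Xbar) = a^T Sigma b with a = e_i - xbar, b = xbar
a = e[0] - xbar
cov_i = a @ Sigma @ xbar
print(cov_i)                 # 0 up to rounding

# (ii) correlation of X_i - Xbar and X_j - Xbar for i != j
b = e[1] - xbar
cov_ij = a @ Sigma @ b
rho = cov_ij / np.sqrt((a @ Sigma @ a) * (b @ Sigma @ b))
print(rho)                   # -1/(n-1), here -1/3
```

The same computation with symbolic σ² reproduces the general result, since σ² scales out of the correlation.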
C2.3 Let X and Y be random variables with finite variance, so that E[X^2] < ∞ and E[Y^2] < ∞. (i) Show that E[X^2] − (E[X])^2 = E[(X − E[X])^2], and hence that the variance of X is always non-negative. (ii) By considering (|X| − |Y|)^2, or otherwise, show that XY has finite expectation. (iii) Let q(t) = E[(X + tY)^2]. Show that q(t) ≥ 0, and by considering the roots of the equation q(t) = 0, deduce that (E[XY])^2 ≤ E[X^2] E[Y^2].
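For part (iii), the standard discriminant argument runs as follows (a sketch):

```latex
q(t) = \mathbb{E}\big[(X+tY)^2\big]
     = \mathbb{E}[X^2] + 2t\,\mathbb{E}[XY] + t^2\,\mathbb{E}[Y^2] \ge 0
     \quad \text{for all } t \in \mathbb{R},
```

so the quadratic in t has at most one real root, hence its discriminant satisfies 4(E[XY])^2 − 4 E[X^2] E[Y^2] ≤ 0, which rearranges to the Cauchy–Schwarz inequality.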
X and Y are random variables. (a) Show that E(X) = E(E(X | Y)). (b) If P({X ≤ x, Y ≤ y}) = P({X ≤ x}) P({Y ≤ y}) for all x and y, then show that E(XY) = E(X) E(Y), i.e. if two random variables are independent, then show that they are uncorrelated. Is the reverse true? Prove or disprove. (c) The moment generating function of a random variable Z is defined as Ψ_Z(t) = E[e^{tZ}]. Now if X and Y are independent random variables, then show that Ψ_{X+Y}(t) = Ψ_X(t) Ψ_Y(t). Also, if Ψ_X(t) = (λ − ... (d) Show the conditional...
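For part (c), the factorization follows directly from independence (sketch):

```latex
\Psi_{X+Y}(t) = \mathbb{E}\big[e^{t(X+Y)}\big]
             = \mathbb{E}\big[e^{tX}\,e^{tY}\big]
             = \mathbb{E}\big[e^{tX}\big]\,\mathbb{E}\big[e^{tY}\big]
             = \Psi_X(t)\,\Psi_Y(t),
```

where the middle equality uses E[UV] = E[U] E[V] for independent U and V, i.e. part (b) applied to e^{tX} and e^{tY}.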
Consider two random variables, X and Y. Let E(X) and E(Y) denote the population means of X and Y respectively, and let Var(X) and Var(Y) denote their population variances. Consider another random variable Z that is a linear combination of X and Y: Z = 3X − Y. What is the population variance of Z? Assume that X and Y are independent, which is to say that their covariance is zero.
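The claim Var(3X − Y) = 9 Var(X) + Var(Y) for independent X and Y can be rehearsed by exact enumeration of two small discrete distributions (the particular supports and probabilities below are made up for illustration):

```python
from itertools import product

# hypothetical independent discrete distributions: value -> probability
X = {0: 0.2, 1: 0.5, 4: 0.3}
Y = {1: 0.6, 3: 0.4}

def var(dist):
    """Exact variance of a finite discrete distribution."""
    mean = sum(v * p for v, p in dist.items())
    return sum((v - mean) ** 2 * p for v, p in dist.items())

# exact distribution of Z = 3X - Y, using independence to multiply probabilities
Z = {}
for (x, px), (y, py) in product(X.items(), Y.items()):
    z = 3 * x - y
    Z[z] = Z.get(z, 0.0) + px * py

print(var(Z))                  # matches 9*var(X) + var(Y) up to rounding
print(9 * var(X) + var(Y))
```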
Given below is a bivariate distribution for the random variables x and y.

    f(x, y)    x     y
    0.3        50    80
    0.2        30    50
    0.5        40    60

(a) Compute the expected value and the variance for x and y.
    E(x) =      E(y) =      Var(x) =      Var(y) =
(b) Develop a probability distribution for x + y.

    x + y    f(x + y)
    130
    80
    100

(c) Using the result of part (b), compute E(x + y) and Var(x + y). E(x...
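A quick numerical pass over the table, assuming the three rows read (f, x, y) = (0.3, 50, 80), (0.2, 30, 50), (0.5, 40, 60):

```python
rows = [(0.3, 50, 80), (0.2, 30, 50), (0.5, 40, 60)]  # (f(x, y), x, y)

E_x = sum(f * x for f, x, y in rows)                    # ≈ 41
E_y = sum(f * y for f, x, y in rows)                    # ≈ 64
Var_x = sum(f * (x - E_x) ** 2 for f, x, y in rows)     # ≈ 49
Var_y = sum(f * (y - E_y) ** 2 for f, x, y in rows)     # ≈ 124

# distribution of x + y: value 130 with prob 0.3, 80 with 0.2, 100 with 0.5
E_s = sum(f * (x + y) for f, x, y in rows)              # ≈ 105 = E(x) + E(y)
Var_s = sum(f * (x + y - E_s) ** 2 for f, x, y in rows) # ≈ 325
# note Var(x + y) != Var(x) + Var(y) here: x and y are not independent
```

The gap between Var(x + y) and Var(x) + Var(y) is 2 Cov(x, y), which is nonzero for this table.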
1) Let X and Y be random variables. Show that Cov(X + Y, X − Y) = Var(X) − Var(Y) without appealing to the general formulas for the covariance of linear combinations of random variables; use the basic identity Cov(Z1, Z2) = E[Z1 Z2] − E[Z1] E[Z2], valid for any two random variables, and the properties of the expected value. 2) Let X be a normal random variable with zero mean and standard deviation σ. Let Φ(t) be the distribution function of the standard normal random variable. ...
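A sketch of problem 1 using only the stated identity and linearity of expectation:

```latex
\begin{aligned}
\operatorname{Cov}(X+Y,\,X-Y)
 &= \mathbb{E}[(X+Y)(X-Y)] - \mathbb{E}[X+Y]\,\mathbb{E}[X-Y]\\
 &= \mathbb{E}[X^2] - \mathbb{E}[Y^2]
    - \big(\mathbb{E}[X]+\mathbb{E}[Y]\big)\big(\mathbb{E}[X]-\mathbb{E}[Y]\big)\\
 &= \big(\mathbb{E}[X^2] - \mathbb{E}[X]^2\big) - \big(\mathbb{E}[Y^2] - \mathbb{E}[Y]^2\big)
  = \operatorname{Var}(X) - \operatorname{Var}(Y).
\end{aligned}
```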
2. (10 pts) Random variables X and Y have the following joint PDF:

    f_{X,Y}(x, y) = 0.1, if both ... ≤ x ≤ ... and ... ≤ y ≤ ...;
                    0, otherwise.

(a) Prepare neat, fully labeled sketches of f_X(x).
(b) Find E[X | Y = y] and Var(X | Y = y).
(c) Find E[X].
(d) Find Var(X) using the law of conditional variances.
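Part (d) can be rehearsed on any small discrete joint distribution; the 2×2 table below is made up for illustration and is not the PDF from the problem:

```python
# hypothetical joint PMF p[(x, y)]; NOT the distribution from the problem
p = {(0, 0): 0.1, (1, 0): 0.3, (0, 1): 0.4, (1, 1): 0.2}

def E(f):
    """Expectation of f(X, Y) under the joint PMF p."""
    return sum(f(x, y) * q for (x, y), q in p.items())

EX, EX2 = E(lambda x, y: x), E(lambda x, y: x * x)
var_X = EX2 - EX ** 2

# conditionals: for each y, compute E[X | Y = y] and Var(X | Y = y)
ys = {y for _, y in p}
py   = {y0: sum(q for (x, y), q in p.items() if y == y0) for y0 in ys}
Exy  = {y0: sum(x * q for (x, y), q in p.items() if y == y0) / py[y0] for y0 in ys}
Ex2y = {y0: sum(x * x * q for (x, y), q in p.items() if y == y0) / py[y0] for y0 in ys}
varxy = {y0: Ex2y[y0] - Exy[y0] ** 2 for y0 in ys}

E_var = sum(py[y0] * varxy[y0] for y0 in ys)            # E[Var(X | Y)]
var_E = sum(py[y0] * (Exy[y0] - EX) ** 2 for y0 in ys)  # Var(E[X | Y])
assert abs(var_X - (E_var + var_E)) < 1e-9              # law of total variance
```

For the continuous PDF in the problem, the sums become integrals but the decomposition Var(X) = E[Var(X | Y)] + Var(E[X | Y]) is applied the same way.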
RANDOM VARIABLES AND DISTRIBUTIONS
Expectation and variance of a random variable
Let X be a random variable with the following probability distribution:

    Value x of X    P(X = x)
    −10             0.35
    0               0.40
    10              0.10
    20              0.15

Find the expectation E(X) and variance Var(X) of X. (If necessary, consult a list of formulas.)
    E(X) =
    Var(X) =
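Assuming the table pairs the values (−10, 0, 10, 20) with the probabilities (0.35, 0.40, 0.10, 0.15), the computation is direct:

```python
dist = {-10: 0.35, 0: 0.40, 10: 0.10, 20: 0.15}   # value -> P(X = value)

E_X = sum(x * p for x, p in dist.items())                   # ≈ 0.5
Var_X = sum((x - E_X) ** 2 * p for x, p in dist.items())    # ≈ 104.75
print(E_X, Var_X)
```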
6. (a) Given that X and Y are continuous random variables, prove from first principles that: ...
(b) The random variable X has a gamma distribution with parameters α = 3 and λ = 2. Y is a related variable with conditional mean and variance E(Y | X = x) = ... Calculate the unconditional mean and standard deviation of Y.
(c) Suppose that a random variable X has a standard normal distribution, and the conditional distribution of a Poisson random variable Y, given the value of X = x, has...