Let X and Y be random variables. The conditional variance of Y given X, denoted Var(Y | X), is defined as Var(Y | X) = E[Y^2 | X] − (E[Y | X])^2. Prove the conditional variance formula, namely that Var(Y) = E[Var(Y | X)] + Var(E[Y | X]). (This identity is known as the Law of Total Variance.) Hint: starting from Var(Y) = E[Y^2] − (E[Y])^2, apply the Law of Total Expectation to each term...
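As a sanity check (not a substitute for the proof), the identity can be verified numerically. The mixture below is purely an illustrative assumption: X is a fair coin flip and Y | X is normal with mean and standard deviation depending on X.

```python
import random

random.seed(0)

# Illustrative model (an assumption, not from the problem):
# X ~ Bernoulli(0.5); given X, Y ~ Normal(mu_X, sigma_X) with
# (mu, sigma) = (0, 1) when X = 0 and (3, 2) when X = 1.
N = 200_000
samples, cond_means, cond_vars = [], [], []
for _ in range(N):
    x = random.random() < 0.5
    mu, sigma = (3.0, 2.0) if x else (0.0, 1.0)
    samples.append(random.gauss(mu, sigma))
    cond_means.append(mu)          # E[Y | X] realized for this draw
    cond_vars.append(sigma * sigma)  # Var(Y | X) realized for this draw

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((u - m) ** 2 for u in v) / len(v)

lhs = var(samples)                        # Var(Y)
rhs = mean(cond_vars) + var(cond_means)   # E[Var(Y|X)] + Var(E[Y|X])
print(lhs, rhs)  # both should be near the exact value 2.5 + 2.25 = 4.75
```

Both sides estimate the same quantity, so they agree up to Monte Carlo noise.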
X and Y are random variables. (a) Show that E(X) = E(E(X | Y)). (b) If P(X ≤ x, Y ≤ y) = P(X ≤ x) P(Y ≤ y) for all x and y, show that E(XY) = E(X)E(Y); i.e. if two random variables are independent, then they are uncorrelated. Is the converse true? Prove or disprove. (c) The moment generating function of a random variable Z is defined as Ψ_Z(t) = E[e^(tZ)]. Now if X and Y are independent random variables, then show that Ψ_(X+Y)(t) = Ψ_X(t) Ψ_Y(t). Also, if Ψ_X(t) = (λ − ... (d) Show the conditional...
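For the converse in part (b), the standard counterexample is Y = X^2 with X symmetric about zero: Cov(X, Y) = E[X^3] = 0, yet Y is a deterministic function of X. A quick numerical sketch (the standard-normal choice for X is just for illustration):

```python
import random

random.seed(1)

# Counterexample sketch: X symmetric about 0, Y = X^2.
# Then Cov(X, Y) = E[X^3] - E[X]E[X^2] = 0, but X and Y are dependent.
N = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(N)]
ys = [x * x for x in xs]

mx = sum(xs) / N
my = sum(ys) / N
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / N
print(cov)  # close to 0
# Yet P(Y > 1 | |X| > 1) = 1 while P(Y > 1) < 1, so they are not independent.
```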
For random variables X and Y with finite variance, the law of total variance states that Var(X) = E(Var(X | Y)) + Var(E(X | Y)). For each n, let Var_n be the conditional-variance analogue of E_n. Write out a formula that relates two such conditional variances, in the way that the iterated-conditioning formula relates two conditional expectations.
(a) If X_1, ..., X_n are independent with Var[X_i] = σ^2 for each i (i = 1, ..., n), find the variance of X̄ = (Σ_i X_i)/n. (b) Let the continuous random variable Y have moment generating function M_Y(t). i. Show that the moment generating function of Z = aY + b is e^(bt) M_Y(at) for non-zero constants a and b. ii. Use this result to write down the moment generating function of W = 1 − 2X if X ~ Gamma(α, β).
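For part (a), independence gives Var(X̄) = σ^2/n. A Monte Carlo sketch, with the illustrative choices n = 10 and σ^2 = 4 (normal draws are an assumption of convenience; only the variance matters):

```python
import random

random.seed(2)

# Sketch: n i.i.d. draws with variance sigma^2 = 4;
# the sample mean should have variance sigma^2 / n = 0.4.
n, sigma2, reps = 10, 4.0, 50_000
means = []
for _ in range(reps):
    draws = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(n)]
    means.append(sum(draws) / n)

m = sum(means) / reps
var_mean = sum((u - m) ** 2 for u in means) / reps
print(var_mean)  # approx 0.4
```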
6. (a) Given that X and Y are continuous random variables, prove from first principles that: ... (b) The random variable X has a gamma distribution with parameters α = 3 and λ = 2. Y is a related variable with conditional mean and variance E(Y | X = x) = ... Calculate the unconditional mean and standard deviation of Y. (c) Suppose that a random variable X has a standard normal distribution, and that the conditional distribution of a Poisson random variable Y, given the value X = x, has...
Problem 4 (Conditional Expectation and Variance). [20 points] Suppose the joint distribution of (X, Y) is given by the following contingency table (rows represent x):

         y = 2   y = 4   y = 6
  x = 1   0.3     0       0.1
  x = 2   0       0.2     0
  x = 3   0.1     0       0.3

A) Compute the marginal distributions of X and Y.
B) Are X and Y independent? Explain.
C) Find the conditional distribution of Y given X = 1.
D) Compute E[Y | X = 1].
E) Compute E[Y | X = 2].
F) Compute E[exp(X) Y | X = 2].
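The table's answers can be checked mechanically. The sketch below encodes the joint pmf as a dict (the names `joint`, `px`, `py`, `cond_E` are my own) and computes the marginals, the independence check, and the conditional expectations:

```python
import math

# Joint pmf from the table above; keys are (x, y) pairs.
joint = {
    (1, 2): 0.3, (1, 4): 0.0, (1, 6): 0.1,
    (2, 2): 0.0, (2, 4): 0.2, (2, 6): 0.0,
    (3, 2): 0.1, (3, 4): 0.0, (3, 6): 0.3,
}

# Marginal distributions of X and Y.
px = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (1, 2, 3)}
py = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (2, 4, 6)}

# Independence would require joint[(x, y)] == px[x] * py[y] for every cell.
independent = all(abs(joint[(x, y)] - px[x] * py[y]) < 1e-12
                  for x in px for y in py)

def cond_E(g, x):
    """E[g(Y) | X = x] computed from the joint pmf."""
    return sum(g(y) * joint[(x, y)] for y in py) / px[x]

E_Y_given_1 = cond_E(lambda y: y, 1)          # (2*0.3 + 6*0.1)/0.4 = 3
E_Y_given_2 = cond_E(lambda y: y, 2)          # 4*0.2/0.2 = 4
E_expX_Y_given_2 = math.exp(2) * E_Y_given_2  # exp(X) is constant given X = 2
print(px, py, independent, E_Y_given_1, E_Y_given_2, E_expX_Y_given_2)
```

Note that in part F, exp(X) is known once we condition on X = 2, so it factors out of the conditional expectation.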
Q1. Operations with expectations and covariances. Recall that the variance of a random variable X is defined as Var(X) ≡ E[(X − E(X))^2], and the covariance as Cov(X, Y) ≡ E[(X − E(X))(Y − E(Y))]. As a hint, we can prove Cov(aX + b, cY) = ac Cov(X, Y) as follows: Cov(aX + b, cY) = E[(aX + b − E(aX + b))(cY − E(cY))] = E[a(X − E(X)) · c(Y − E(Y))] = ac E[(X − E(X))(Y − E(Y))] = ac Cov(X, Y). In a similar manner, prove the following properties using the definitions of the variance and the covariance: (a) Var(X) = Cov(X, ...
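The hint identity Cov(aX + b, cY) = ac Cov(X, Y) also holds exactly for sample covariances, since they are bilinear in the same way. A numeric sketch (the constants a, b, c and the correlated sample are arbitrary illustrative choices):

```python
import random

random.seed(4)

# Check Cov(aX + b, cY) = ac Cov(X, Y) on a sample; the equality is
# algebraic, so it holds up to floating-point error, not just approximately.
N = 100_000
a, b, c = 2.0, 5.0, -3.0
xs = [random.gauss(0.0, 1.0) for _ in range(N)]
ys = [0.5 * x + random.gauss(0.0, 1.0) for x in xs]

def cov(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / len(u)

lhs = cov([a * x + b for x in xs], [c * y for y in ys])
rhs = a * c * cov(xs, ys)
print(lhs, rhs)
```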
7. Let X be a random variable with probability density function fX(x) = ... for −1 < x < 1, and fX(x) = 0 otherwise. (a) Find the mean μ and variance σ^2 of X. (b) Derive the moment generating function of X and state the values of t for which it is defined. (c) For the value(s) of t at which the moment generating function found in part (b) is not defined, what should the moment generating function be defined as? Justify your answer. (d) Let X_1, ...
The moment generating function φ(t) of a random variable X is defined for all values of t by φ(t) = E[e^(tX)]; that is, φ(t) = Σ_x e^(tx) p(x) if X is discrete, and φ(t) = ∫ e^(tx) f(x) dx if X is continuous. (a) Find the moment generating function of a Binomial random variable X with parameters n (the total number of trials) and p (the probability of success). (b) If X and Y are independent Binomial random variables with parameters (n_1, p) and (n_2, p), respectively, then what is the distribution of X + Y?
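The closed form for part (a) is φ(t) = (p e^t + 1 − p)^n, and for part (b) the product of the two MGFs equals the MGF of a Binomial(n_1 + n_2, p), which identifies the distribution of X + Y. A numeric check with arbitrary illustrative parameters:

```python
import math

def binom_pmf(n, p, k):
    # P(X = k) for X ~ Binomial(n, p).
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def mgf_from_def(n, p, t):
    # phi(t) = sum_k e^(tk) P(X = k), directly from the definition.
    return sum(math.exp(t * k) * binom_pmf(n, p, k) for k in range(n + 1))

def mgf_closed(n, p, t):
    # Closed form: (p e^t + 1 - p)^n.
    return (p * math.exp(t) + 1 - p) ** n

n1, n2, p, t = 5, 7, 0.3, 0.4           # illustrative values
lhs = mgf_closed(n1, p, t) * mgf_closed(n2, p, t)  # MGF of X + Y (independence)
rhs = mgf_closed(n1 + n2, p, t)                    # MGF of Binomial(n1+n2, p)
print(mgf_from_def(n1, p, t), mgf_closed(n1, p, t), lhs, rhs)
```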
Suppose Y is uniformly distributed on (0, 1), and that the conditional distribution of X given that Y = y is uniform on (0, y). Find E[X] and Var(X).
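By the tower rules, E[X] = E[E[X | Y]] = E[Y/2] = 1/4, and Var(X) = E[Var(X | Y)] + Var(E[X | Y]) = E[Y^2/12] + Var(Y/2) = 1/36 + 1/48 = 7/144. A simulation sketch to confirm:

```python
import random

random.seed(3)

# X | Y = y ~ Uniform(0, y) with Y ~ Uniform(0, 1).
# Exact answers: E[X] = 1/4 and Var(X) = 7/144 ≈ 0.0486.
N = 400_000
xs = []
for _ in range(N):
    y = random.random()
    xs.append(random.uniform(0.0, y))

m = sum(xs) / N
v = sum((x - m) ** 2 for x in xs) / N
print(m, v)
```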