6. (a) Given that X and Y are continuous random variables, prove from first principles that:...
1) Let X and Y be random variables. Show that Cov(X + Y, X − Y) = Var(X) − Var(Y) without appealing to the general formulas for the covariance of linear combinations of random variables; use the basic identity Cov(Z1, Z2) = E[Z1 Z2] − E[Z1] E[Z2], valid for any two random variables, and the properties of the expected value. 2) Let X be a normal random variable with zero mean and standard deviation σ. Let Φ(t) be the distribution function of the standard normal random variable. ...
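A quick numeric sanity check (not a proof) of the identity in part 1, using a small illustrative discrete joint distribution; the values and probabilities below are made up for the check, not taken from the problem:

```python
# Check Cov(X+Y, X-Y) = Var(X) - Var(Y) on an illustrative
# discrete joint distribution over four (x, y) pairs.
dist = {(1, 2): 0.2, (1, 5): 0.3, (4, 2): 0.1, (4, 5): 0.4}

def E(g):
    """Expected value of g(x, y) under the joint distribution."""
    return sum(p * g(x, y) for (x, y), p in dist.items())

def cov(g, h):
    """Covariance via the basic identity E[Z1*Z2] - E[Z1]*E[Z2]."""
    return E(lambda x, y: g(x, y) * h(x, y)) - E(g) * E(h)

var_X = cov(lambda x, y: x, lambda x, y: x)
var_Y = cov(lambda x, y: y, lambda x, y: y)
lhs = cov(lambda x, y: x + y, lambda x, y: x - y)
print(lhs, var_X - var_Y)  # equal up to float rounding
```

The check works for any joint distribution, since the identity is purely algebraic in the moments.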
a. Suppose X and Y are continuous random variables with joint density f(x, y). Prove that the density of X + Y is given by f_{X+Y}(t) = ∫ f(u, t − u) du, where the integral runs over all u. b. Use part (a) to show that if X and Y are independent and standard Gaussian (i.e. N(0,1)), then X + Y is a centered Gaussian with variance 2, that is, N(0, 2).
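A numeric check (again, not a proof) of part (b): evaluating the convolution integral for two independent N(0,1) densities and comparing it against the N(0,2) density at a few points. The grid size and integration limits below are arbitrary choices:

```python
import math

def phi(x):
    """Standard normal N(0,1) density."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def conv_density(t, n=4000, lim=10.0):
    """f_{X+Y}(t) = integral over u of phi(u)*phi(t-u), midpoint rule."""
    h = 2 * lim / n
    total = 0.0
    for k in range(n):
        u = -lim + (k + 0.5) * h
        total += phi(u) * phi(t - u)
    return total * h

def n02(t):
    """Density of N(0, 2), the claimed distribution of X + Y."""
    return math.exp(-t * t / 4) / math.sqrt(4 * math.pi)

for t in (0.0, 1.0, 2.5):
    print(t, conv_density(t), n02(t))  # columns 2 and 3 match closely
```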
Let X and Y be random variables. The conditional variance of Y given X, denoted Var(Y | X), is defined as Var(Y | X) = E[Y^2 | X] − (E[Y | X])^2. Show that Var(Y) = E[Var(Y | X)] + Var(E[Y | X]). (The equality you are showing is known as the Law of Total Variance.) Hint: from the Law of Total Expectation, you get Var(Y) = E[Y^2] − (E[Y])^2 ...
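The law of total variance can be verified exactly on a small discrete joint distribution; the one below is illustrative, chosen only so that both sides are easy to compute:

```python
# Exact check of Var(Y) = E[Var(Y|X)] + Var(E[Y|X]) on an
# illustrative discrete joint pmf over (x, y) pairs.
joint = {(0, 1): 0.1, (0, 3): 0.2, (1, 1): 0.4, (1, 5): 0.3}

# marginal pmf of X
px = {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p

def cond_moments(x):
    """Return (E[Y|X=x], Var(Y|X=x)) from the conditional pmf of Y."""
    cond = {y: p / px[x] for (xx, y), p in joint.items() if xx == x}
    m = sum(y * p for y, p in cond.items())
    v = sum(y * y * p for y, p in cond.items()) - m * m
    return m, v

ey = sum(y * p for (x, y), p in joint.items())
var_y = sum(y * y * p for (x, y), p in joint.items()) - ey * ey

e_condvar = sum(px[x] * cond_moments(x)[1] for x in px)
m_vals = {x: cond_moments(x)[0] for x in px}
var_condmean = (sum(px[x] * m_vals[x] ** 2 for x in px)
                - sum(px[x] * m_vals[x] for x in px) ** 2)

print(var_y, e_condvar + var_condmean)  # both sides agree
```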
Given below is a bivariate distribution for the random variables x and y.

    f(x, y)    x     y
    0.3        50    80
    0.2        30    50
    0.5        40    60

(a) Compute the expected value and the variance for x and y: E(x), E(y), Var(x), Var(y). (b) Develop a probability distribution for x + y:

    x + y    f(x + y)
    130
    80
    100

(c) Using the result of part (b), compute E(x + y) and Var(x + y). ...
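The table has only three (x, y) pairs, so everything in (a)-(c) follows from direct summation; a short script makes a handy cross-check of the hand computation:

```python
# Direct computation of (a)-(c) from the three-row table above.
pmf = {(50, 80): 0.3, (30, 50): 0.2, (40, 60): 0.5}

Ex = sum(p * x for (x, y), p in pmf.items())               # = 41
Ey = sum(p * y for (x, y), p in pmf.items())               # = 64
Vx = sum(p * (x - Ex) ** 2 for (x, y), p in pmf.items())   # = 49
Vy = sum(p * (y - Ey) ** 2 for (x, y), p in pmf.items())   # = 124

# (b): each (x, y) pair maps to a distinct sum, so probabilities carry over
pmf_sum = {x + y: p for (x, y), p in pmf.items()}          # {130: 0.3, 80: 0.2, 100: 0.5}
Es = sum(p * s for s, p in pmf_sum.items())                # = 105
Vs = sum(p * (s - Es) ** 2 for s, p in pmf_sum.items())    # = 325
print(Ex, Ey, Vx, Vy, Es, Vs)
```

Note that Var(x + y) = 325 differs from Var(x) + Var(y) = 173, reflecting the positive covariance between x and y in this table.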
Problem 2 Suppose two continuous random variables (X, Y) ~ f(x, y). (1) Prove E(X + Y) = E(X) + E(Y). (2) Prove Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y). (3) Prove Cov(X, Y) = E(XY) − E(X)E(Y). (4) Prove that if X and Y are independent, i.e., f(x, y) = fX(x) fY(y) for any (x, y), then Cov(X, Y) = 0. Is the reverse true? (5) Prove Cov(aX + b, cY + d) = ac Cov(X, Y). (6) Prove Cov(X, X) = Var(X).
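Identities (2), (3), (5), and (6) are purely algebraic, so they hold for any distribution; a quick way to sanity-check them is on the empirical distribution of a small sample (population-style moments, dividing by n). The sample values below are arbitrary:

```python
# Sanity-check of identities (2), (5), (6) on an arbitrary sample,
# treated as an empirical (population) distribution.
xs = [1.0, 2.0, 2.0, 5.0, 7.0]
ys = [3.0, 1.0, 4.0, 4.0, 9.0]
n = len(xs)

def mean(v):
    return sum(v) / n

def cov(u, v):
    """Identity (3): Cov = E(UV) - E(U)E(V)."""
    return mean([a * b for a, b in zip(u, v)]) - mean(u) * mean(v)

def var(v):
    """Identity (6): Var(X) = Cov(X, X)."""
    return cov(v, v)

s = [a + b for a, b in zip(xs, ys)]
assert abs(var(s) - (var(xs) + var(ys) + 2 * cov(xs, ys))) < 1e-9   # (2)

a, b, c, d = 2.0, -1.0, 3.0, 5.0
t1 = [a * x + b for x in xs]
t2 = [c * y + d for y in ys]
assert abs(cov(t1, t2) - a * c * cov(xs, ys)) < 1e-9                # (5)
print("identities hold on the sample")
```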
X and Y are random variables. (a) Show that E(X) = E(E(X | Y)). (b) If P({X ≤ x, Y ≤ y}) = P({X ≤ x}) P({Y ≤ y}), then show that E(XY) = E(X)E(Y), i.e. if two random variables are independent, then show that they are uncorrelated. Is the reverse true? Prove or disprove. (c) The moment generating function of a random variable Z is defined as Ψ_Z(t) = E[e^{tZ}]. Now if X and Y are independent random variables, then show that Ψ_{X+Y}(t) = Ψ_X(t) Ψ_Y(t). Also, if Ψ_X(t) = (λ − ... (d) Show the conditional...
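For the "reverse" question in (b), the standard counterexample is X uniform on {−1, 0, 1} with Y = X^2: the pair is uncorrelated yet clearly dependent. A short computation confirms both facts (the example is an illustration, not taken from the problem text):

```python
# Uncorrelated does not imply independent: X uniform on {-1, 0, 1}, Y = X^2.
pmf = {-1: 1 / 3, 0: 1 / 3, 1: 1 / 3}

EX = sum(p * x for x, p in pmf.items())              # E[X] = 0
EY = sum(p * x * x for x, p in pmf.items())          # E[Y] = E[X^2]
EXY = sum(p * x * (x * x) for x, p in pmf.items())   # E[XY] = E[X^3] = 0
cov = EXY - EX * EY
print(cov)  # prints 0.0: uncorrelated

# yet X and Y are dependent:
p_joint = pmf[0]          # P(X = 0, Y = 0) = 1/3, since Y = 0 iff X = 0
p_prod = pmf[0] * pmf[0]  # P(X = 0) * P(Y = 0) = 1/9
print(p_joint, p_prod)    # 1/3 vs 1/9, not equal
```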
I. Let X be a random sample from an exponential distribution with unknown rate parameter θ and p.d.f. ... (a) Find the probability that X > 2. (b) Find the moment generating function of X, and its mean and variance. (c) Show that if X1 and X2 are two independent random variables with an exponential distribution with rate parameter θ, then Y = X1 + X2 is a random variable with a gamma distribution, and determine its parameters (you can use the moment generating...
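As a numeric illustration for (b) and (c) (the rate value below is arbitrary): the MGF of an Exp(θ) variable is θ/(θ − t) for t < θ, so the MGF of X1 + X2 is (θ/(θ − t))^2, which is the MGF of a Gamma distribution with shape 2 and rate θ. The script computes E[e^{tX}] by direct numerical integration and compares it with the closed form:

```python
import math

theta = 1.5  # illustrative rate parameter

def mgf_exp_numeric(t, n=50000, lim=40.0):
    """E[e^{tX}] for X ~ Exp(theta), midpoint Riemann sum (needs t < theta)."""
    h = lim / n
    total = 0.0
    for k in range(n):
        u = (k + 0.5) * h
        total += math.exp(t * u) * theta * math.exp(-theta * u)
    return total * h

t = 0.7
print(mgf_exp_numeric(t), theta / (theta - t))              # match closely
print(mgf_exp_numeric(t) ** 2, (theta / (theta - t)) ** 2)  # MGF of X1 + X2
```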
Let X and Y be jointly continuous random variables with joint probability density given by

    f(x, y) = (12/5)(2x − x^2 − xy)  for 0 < x < 1, 0 < y < 1
    f(x, y) = 0                      otherwise

(a) Find the marginal densities for X and Y. (b) Find the conditional density for X given Y = y and the conditional density for Y given X = x. (c) Compute the probability P(1/2 < X < 1 | Y = 1/4). (d) Determine whether...
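A numeric cross-check for (a) and (c): integrating the given density confirms that both marginals integrate to 1 and gives the conditional probability in (c). The midpoint-rule grid size is an arbitrary choice:

```python
# Numeric checks for the density f(x, y) = (12/5)(2x - x^2 - xy) on (0,1)^2.
def f(x, y):
    return 12.0 / 5.0 * (2 * x - x * x - x * y)

def integrate(g, a, b, n=1000):
    """Midpoint-rule integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

# marginal densities, obtained by integrating out the other variable
fX = lambda x: integrate(lambda y: f(x, y), 0.0, 1.0)
fY = lambda y: integrate(lambda x: f(x, y), 0.0, 1.0)
print(integrate(fX, 0.0, 1.0), integrate(fY, 0.0, 1.0))  # both ~1

# (c): P(1/2 < X < 1 | Y = 1/4) = int_{1/2}^1 f(x, 1/4) dx / fY(1/4)
p = integrate(lambda x: f(x, 0.25), 0.5, 1.0) / fY(0.25)
print(p)  # ~0.6731 (the exact value works out to 35/52)
```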
Let X and Y be independent normal random variables with parameters E[X] = μX, E[Y] = μY and Var(X) = σX^2, Var(Y) = σY^2. Indicate whether each of the following statements is true or false. Notation: fX,Y(x, y), fX(x), fY(y) denote the joint and marginal PDFs of X and Y, respectively; Φ(x) is the CDF of a standard normal random variable (zero mean and unit variance). E[XY] = 0