X and Y are random variables. (a) Show that E(X) = E(E(X | Y)). (b) If P((X x, Y )...
The moment generating function φ(t) of a random variable X is defined for all values of t by φ(t) = E[e^{tX}]; that is, φ(t) = Σ_x e^{tx} p(x) if X is discrete, and φ(t) = ∫ e^{tx} f(x) dx if X is continuous. (a) Find the moment generating function of a Binomial random variable X with parameters n (the total number of trials) and p (the probability of success). (b) If X and Y are independent Binomial random variables with parameters (n1, p) and (n2, p), respectively, then what is the distribution of X + Y?
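As a quick numerical sanity check (not a proof) of part (a), one can compare the direct expectation E[e^{tX}] with the closed form (p e^t + 1 - p)^n; the sketch below is Python, and the particular values of n, p, t are arbitrary choices.

```python
from math import comb, exp

# Numerical check (not a proof) of part (a): for X ~ Binomial(n, p) the MGF
# E[e^{tX}] should equal (p*e^t + 1 - p)^n.  n, p, t are arbitrary choices.
n, p, t = 10, 0.3, 0.7
direct = sum(exp(t * k) * comb(n, k) * p ** k * (1 - p) ** (n - k)
             for k in range(n + 1))
closed = (p * exp(t) + 1 - p) ** n
print(direct, closed)   # the two values agree to floating-point precision
```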
Problems 1. A binomial random variable has the moment generating function ψ(t) = E[e^{tX}] = (p e^t + 1 - p)^n. Show that E[X] = np and Var(X) = np(1 - p), using that E[X] = ψ'(0) and E[X²] = ψ''(0). 2. Let X be uniformly distributed over (a, b). Show that E[X] = (a + b)/2 and Var(X) = (b - a)²/12 using the first and second moments of this random variable, where the pdf of X is f(x) = 1/(b - a) for a < x < b. Note that the nth moment of a continuous random variable is defined as E[X^n] = ∫ x^n f(x) dx over (-∞, ∞). 3. Show that...
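Problem 2 can be sanity-checked numerically; the sketch below integrates x f(x) and x² f(x) with a simple midpoint rule for one arbitrary choice of the endpoints (a, b).

```python
# Numerical check (not a proof) of the Uniform(a, b) moments: E[X] = (a+b)/2
# and Var(X) = (b-a)^2 / 12.  The endpoints a, b are arbitrary choices.
a, b = 2.0, 7.0
N = 100_000
dx = (b - a) / N
m1 = m2 = 0.0
for i in range(N):
    x = a + (i + 0.5) * dx      # midpoint of the i-th subinterval
    f = 1.0 / (b - a)           # uniform pdf on (a, b)
    m1 += x * f * dx            # contribution to the first moment
    m2 += x * x * f * dx        # contribution to the second moment
print(m1, m2 - m1 ** 2)         # ≈ 4.5 and ≈ 25/12 ≈ 2.0833
```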
Problem 2 Suppose two continuous random variables (X, Y) ~ f(x, y). (1) Prove E(X + Y) = E(X) + E(Y). (2) Prove Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y). (3) Prove Cov(X, Y) = E(XY) - E(X)E(Y). (4) Prove that if X and Y are independent, i.e., f(x, y) = f_X(x) f_Y(y) for any (x, y), then Cov(X, Y) = 0. Is the reverse true? (5) Prove Cov(aX + b, cY + d) = ac Cov(X, Y). (6) Prove Cov(X, X) = Var(X).
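Identities (2) and (5) also hold exactly for sample variances and covariances, which gives a cheap sanity check before attempting the proofs. In the sketch below the correlated pair (X, Y) is an arbitrary construction of my own, not part of the problem.

```python
import random

# Sanity check (not a proof): identities (2) and (5) hold exactly for
# *sample* variances/covariances too.  The correlated pair (X, Y) below
# is an arbitrary choice, not taken from the problem.
random.seed(0)
n = 50_000
xs, ys = [], []
for _ in range(n):
    z = random.gauss(0, 1)
    xs.append(z + random.gauss(0, 1))       # X = Z + noise
    ys.append(2 * z + random.gauss(0, 1))   # Y = 2Z + noise, correlated with X

def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    mu, mv = mean(u), mean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

s = [a + b for a, b in zip(xs, ys)]
lhs2 = cov(s, s)                                    # sample Var(X + Y)
rhs2 = cov(xs, xs) + cov(ys, ys) + 2 * cov(xs, ys)  # identity (2)
lhs5 = cov([2 * a + 3 for a in xs], [5 * b - 1 for b in ys])
rhs5 = 2 * 5 * cov(xs, ys)                          # identity (5), ac = 10
print(lhs2 - rhs2, lhs5 - rhs5)   # both differences ~ 0 (float roundoff only)
```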
Question 3 [17 marks] The random variables X and Y are continuous, with joint pdf f_{X,Y}(x, y) = c e^{-y} for 0 < x < y, and 0 otherwise. (a) Show that f_Y(y) = c y e^{-y} for y > 0 (and 0 otherwise), and hence that c = 1. What is this pdf called? (b) Compute E(Y) and Var(Y); (c) Show that f_X(x) = e^{-x} for x > 0 (and 0 otherwise); (d) Are X and Y independent? Give reasons; (e) Show that E(X | Y) = Y/2, and hence show that E(XY) = ...
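Part (a) can be sanity-checked numerically: with c = 1, the inner integral of e^{-y} over 0 < x < y is just y, so integrating y e^{-y} over y > 0 should give total mass 1. A minimal sketch (truncating the integral at an arbitrary cutoff):

```python
from math import exp

# Numerical sanity check for part (a): integrate f(x, y) = e^{-y} over
# 0 < x < y with c = 1 and confirm the total mass is 1.  The inner integral
# over x equals y, so we integrate f_Y(y) = y * e^{-y} on a truncated grid.
dy, Y_MAX = 1e-3, 50.0          # truncation at y = 50 loses only ~e^{-50} mass
total = 0.0
y = dy / 2                      # midpoint rule
while y < Y_MAX:
    total += y * exp(-y) * dy   # f_Y(y) = y e^{-y}
    y += dy
print(total)                    # ≈ 1
```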
Let X and Y be random variables. The conditional variance of Y given X, denoted Var(Y | X), is defined as Var(Y | X) = E[Y² | X] - (E[Y | X])². Show that Var(Y) = E[Var(Y | X)] + Var(E[Y | X]). (This equality you are showing is known as the Law of Total Variance.) Hint: From the Law of Total Expectation, you get Var(Y) = E[Y²] - (E[Y])²...
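Before proving the identity, it can help to verify it exactly on a small discrete model. The sketch below uses exact rational arithmetic; all values and probabilities are arbitrary choices of my own, not from the text.

```python
from fractions import Fraction as F

# Exact check of the Law of Total Variance on a small hypothetical discrete
# model (all values and probabilities are arbitrary, not from the problem).
pX = {0: F(1, 3), 1: F(2, 3)}                  # distribution of X
pY_given_X = {0: {1: F(1, 2), 3: F(1, 2)},     # distribution of Y | X = 0
              1: {0: F(1, 4), 2: F(3, 4)}}     # distribution of Y | X = 1

def mean(dist):
    return sum(v * p for v, p in dist.items())

def var(dist):
    m = mean(dist)
    return sum((v - m) ** 2 * p for v, p in dist.items())

# Marginal distribution of Y
pY = {}
for x, px in pX.items():
    for y, py in pY_given_X[x].items():
        pY[y] = pY.get(y, F(0)) + px * py

e_var = sum(pX[x] * var(pY_given_X[x]) for x in pX)   # E[Var(Y | X)]

dist_E = {}                                           # distribution of E[Y | X]
for x, px in pX.items():
    m = mean(pY_given_X[x])
    dist_E[m] = dist_E.get(m, F(0)) + px
var_e = var(dist_E)                                   # Var(E[Y | X])

print(var(pY) == e_var + var_e)   # True, and exactly (Fraction arithmetic)
```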
Random variables X and Y have the following distributions. P(X = -1) = 3/4, P(X = 3) = 1/4; P(Y = -3) = 1/2, P(Y = 2) = 1/2. a) Using the moment generating functions for the random variables above, find E[X+Y]. b) Using the moment generating functions for the random variables above, find Var(X+Y).
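A sketch of the MGF route in Python, assuming X and Y are independent (which the product step M_{X+Y}(t) = M_X(t) M_Y(t) requires); the derivatives at t = 0 are taken numerically here rather than symbolically.

```python
from math import exp

# MGF sketch for this problem, assuming X and Y are independent so that
# M_{X+Y}(t) = M_X(t) * M_Y(t).  Moments come from derivatives at t = 0,
# approximated with finite differences.
def M(t):
    MX = 0.75 * exp(-1 * t) + 0.25 * exp(3 * t)   # M_X(t) = E[e^{tX}]
    MY = 0.50 * exp(-3 * t) + 0.50 * exp(2 * t)   # M_Y(t) = E[e^{tY}]
    return MX * MY

h = 1e-5
m1 = (M(h) - M(-h)) / (2 * h)              # E[X+Y]     = M'(0)
m2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2    # E[(X+Y)^2] = M''(0)
print(m1, m2 - m1 ** 2)                    # E[X+Y] ≈ -0.5, Var(X+Y) ≈ 9.25
```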
Let X and Y be two independent random variables. Show that Cov (X, XY) = E(Y) Var(X).
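The claim can be checked exactly on a small example before proving it in general. The sketch below picks arbitrary independent discrete X and Y (values and probabilities are my own choices) and verifies Cov(X, XY) = E(Y) Var(X) with exact rational arithmetic.

```python
from fractions import Fraction as F
from itertools import product

# Exact check of Cov(X, XY) = E(Y) Var(X) on a small hypothetical example:
# X and Y independent discrete variables (values/probabilities arbitrary).
pX = {F(-1): F(1, 4), F(2): F(3, 4)}
pY = {F(1): F(1, 2), F(4): F(1, 2)}

EX   = sum(x * p for x, p in pX.items())
EY   = sum(y * p for y, p in pY.items())
EX2  = sum(x * x * p for x, p in pX.items())
varX = EX2 - EX ** 2

# Cov(X, XY) = E[X^2 Y] - E[X] E[XY]; expectations over the product measure
EX2Y = sum(x * x * y * px * py
           for (x, px), (y, py) in product(pX.items(), pY.items()))
EXY  = sum(x * y * px * py
           for (x, px), (y, py) in product(pX.items(), pY.items()))

print(EX2Y - EX * EXY == EY * varX)   # True, exactly
```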
Random variables X and Y have the following distributions: P(X = -4) = 2/3, P(X = -1) = 1/3; P(Y = 2) = 1/2, P(Y = 3) = 1/2. a) (5 points) Using the moment generating functions for the random variables above, find E[X+Y]. b) (5 points) Using the moment generating functions for the random variables above, find Var(X+Y).
Random variables X and Y have the following distributions. P(X = -1) = 3/4, P(X = -2) = 1/4; P(Y = 3) = 1/2, P(Y = 2) = 1/2. a) Using the moment generating functions for the random variables above, find E[X+Y]. b) Using the moment generating functions for the random variables above, find Var(X+Y).