We said in class that two events A and B are independent if μ(A ∩ B) = ...
Let X and Y be independent random variables with known moment generating functions M_X(t) and M_Y(t), and let Z be such that P(Z = 1) = 1 − P(Z = 0) = p ∈ (0, 1). Compute the moment generating function of the random variable S = ZX + (1 − Z)Y. [The distribution of S is called a mixture of the distributions of X and Y.] Your answer can be left in terms of M_X(t) and M_Y(t). Hint: If you don't know how/where to...
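A Monte Carlo sketch is a handy way to sanity-check a candidate answer to the mixture problem. Conditioning on Z (assuming, as is standard for mixtures, that Z is independent of X and Y) suggests M_S(t) = p M_X(t) + (1 − p) M_Y(t); the check below uses the hypothetical choices X ~ N(0, 1) and Y ~ Exp(1), whose MGFs are known in closed form.

```python
import numpy as np

# Monte Carlo sanity check for the mixture MGF. Assumptions (not in the
# problem statement): Z is independent of X and Y, X ~ N(0, 1), Y ~ Exp(1).
rng = np.random.default_rng(0)
n = 200_000
p = 0.3
X = rng.normal(0.0, 1.0, n)
Y = rng.exponential(1.0, n)
Z = rng.binomial(1, p, n)
S = Z * X + (1 - Z) * Y

t = 0.4
emp = np.exp(t * S).mean()       # empirical M_S(t)
Mx = np.exp(t**2 / 2)            # MGF of N(0, 1)
My = 1.0 / (1.0 - t)             # MGF of Exp(1), valid for t < 1
mix = p * Mx + (1 - p) * My      # candidate answer: p*M_X(t) + (1-p)*M_Y(t)
print(emp, mix)
```

The two printed values should agree to Monte Carlo accuracy; any exact derivation still needs the conditioning argument spelled out.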
The moment generating function (MGF) of a random variable X is M_X(t) = E[e^{tX}]. One useful property of moment generating functions is that they make it relatively easy to handle weighted sums of independent random variables: if Z = αX + βY with X and Y independent, then M_Z(t) = M_X(αt) M_Y(βt). (A) Derive the MGF for a Poisson random variable X with parameter λ. (B) Let X be a Poisson random variable with parameter λ, as above, and let Y be a Poisson random variable with parameter γ. X...
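The weighted-sum property quoted above can be illustrated numerically. The sketch below uses a hypothetical choice of X, Y i.i.d. Exp(1), whose MGF M(t) = 1/(1 − t) (for t < 1) is standard, and compares the empirical MGF of Z = aX + bY with the product M_X(at) M_Y(bt).

```python
import numpy as np

# Numeric illustration of M_Z(t) = M_X(at) * M_Y(bt) for independent X, Y.
# Hypothetical choice: X, Y i.i.d. Exp(1), with MGF M(t) = 1/(1 - t), t < 1.
rng = np.random.default_rng(1)
n = 400_000
a, b, t = 2.0, 3.0, 0.1
X = rng.exponential(1.0, n)
Y = rng.exponential(1.0, n)
Z = a * X + b * Y

emp = np.exp(t * Z).mean()                            # empirical M_Z(t)
prod = (1.0 / (1.0 - a * t)) * (1.0 / (1.0 - b * t))  # M_X(at) * M_Y(bt)
print(emp, prod)
```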
(6) Suppose that X is an absolutely continuous random variable with density f(x) = …, 1 < x < 2, and f(x) = 0 otherwise. Find (a) the moment generating function M_X(t), (b) the skewness of X, (c) the kurtosis of X. (7) Suppose that X, Y and Z are random variables such that ρ(X, Y) = 1 and ρ(Y, Z) = −1. What is ρ(X, Z)? Explain your answer. (8) Suppose that X, Y and Z are random variables such that ρ(X, Y) = −1 and ρ(Y, Z) = 0. What...
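For problem (7), recall that ρ = ±1 forces an exact linear relationship, so ρ(X, Y) = 1 and ρ(Y, Z) = −1 pin down ρ(X, Z). A small numeric illustration, with hypothetical linear coefficients:

```python
import numpy as np

# rho(X, Y) = 1 means Y = aX + b with a > 0, and rho(Y, Z) = -1 means
# Z = -cY + d with c > 0; composing the two gives a decreasing linear map
# from X to Z. The coefficients below are hypothetical illustrations.
rng = np.random.default_rng(2)
X = rng.normal(size=10_000)
Y = 2.0 * X + 1.0
Z = -3.0 * Y + 5.0
r = np.corrcoef(X, Z)[0, 1]
print(r)  # -1 up to floating-point error
```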
7. Let X and Y be independent Gaussian random variables with identical densities N(0, 1). Compute the conditional density of X given that the sum Z = X + Y is known (i.e., X | X + Y).
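Before doing the calculation, a crude Monte Carlo experiment can hint at what the conditional density should look like: sample pairs, keep those whose sum falls in a narrow window around a fixed value z, and inspect the surviving X values. The window width and the value z below are arbitrary choices.

```python
import numpy as np

# Approximate X | X + Y = z by rejection: keep samples with X + Y near z,
# then look at the conditional mean and variance of the retained X's.
rng = np.random.default_rng(3)
n = 2_000_000
X = rng.normal(size=n)
Y = rng.normal(size=n)
z = 1.0
keep = np.abs(X + Y - z) < 0.02      # crude conditioning window
m, v = X[keep].mean(), X[keep].var()
print(m, v)
```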
Problem 5 of 5: Sum of random variables. Let N(μ, σ²) denote the Gaussian (or normal) pdf with mean μ and variance σ², namely f_X(x) = (1/√(2πσ²)) exp(−(x − μ)²/(2σ²)). Let X and Y be two i.i.d. random variables distributed as Gaussian with mean 0 and variance 1. 1. Show that Z = X + Y is again a Gaussian random variable, but with mean 0 and variance 2. Show your full proof with integrals. 2. From the above, can you derive what will be the...
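The claim in part 1 is easy to check empirically before writing the convolution integrals; a minimal sketch:

```python
import numpy as np

# Empirical check that Z = X + Y has mean 0 and variance 2 for X, Y ~ N(0,1),
# plus a crude normality check: P(|Z| <= sqrt(2)) should be about 0.6827,
# i.e. the one-standard-deviation probability of N(0, 2).
rng = np.random.default_rng(4)
n = 500_000
Z = rng.normal(size=n) + rng.normal(size=n)
frac = np.mean(np.abs(Z) <= np.sqrt(2))
print(Z.mean(), Z.var(), frac)
```

This is only a consistency check, of course; the problem asks for the full proof with integrals.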
4. The moment generating function of the normal distribution with parameters μ and σ² is ψ(t) = exp(μt + σ²t²/2) for −∞ < t < ∞. Show that E[X] = ψ′(0) = μ and Var(X) = ψ″(0) − [ψ′(0)]² = σ². 5. Suppose that X₁, X₂, and X₃ are independent random variables such that E[Xᵢ] = 0 and E[Xᵢ²] = 1 for i = 1, 2, 3. Find the value of E[… (2X₁ … X₃)²]. 6. Suppose that X and Y are random variables such that Var(X) = Var(Y) = 2 and Cov(X, Y) = 1. Find the value of Var(3X − ...
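The derivative identities in problem 4 can be sanity-checked with finite differences before doing the calculus; the values of μ and σ below are arbitrary test values.

```python
import math

# Finite-difference check that psi'(0) = mu and psi''(0) - psi'(0)^2 = sigma^2
# for psi(t) = exp(mu*t + sigma^2 * t^2 / 2).
mu, sigma = 1.5, 2.0
psi = lambda t: math.exp(mu * t + sigma**2 * t**2 / 2)
h = 1e-4
d1 = (psi(h) - psi(-h)) / (2 * h)             # central difference ~ psi'(0)
d2 = (psi(h) - 2 * psi(0) + psi(-h)) / h**2   # second difference ~ psi''(0)
print(d1, d2 - d1**2)  # should be close to mu and sigma^2
```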
Problems. 1. A binomial random variable has the moment generating function ψ(t) = E[e^{tX}] = (pe^t + 1 − p)^n. Show that E[X] = np and Var(X) = np(1 − p), using E[X] = ψ′(0) and E[X²] = ψ″(0). 2. Let X be uniformly distributed over (a, b). Show that E[X] = (a + b)/2 and Var(X) = (b − a)²/12 using the first and second moments of this random variable, where the pdf of X is f(x) = 1/(b − a) for a < x < b. Note that the nth moment of a continuous random variable is defined as E[Xⁿ] = ∫ xⁿ f(x) dx. 3. Show that ∫ (x − μ)² exp(−(x − μ)²/(2σ²)) dx ... Hint: ∫ u dv = uv − ∫ v du and ∫ e^{−u²/2} du = √(2π). 4. The...
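The closed forms in problem 2 can be verified by numeric integration; the endpoints used below are hypothetical.

```python
import numpy as np

# Midpoint-rule approximation of the first two moments of Uniform(a, b),
# compared against the closed forms (a + b)/2 and (b - a)^2 / 12.
a, b = 1.0, 4.0
n = 1_000_000
dx = (b - a) / n
x = a + (np.arange(n) + 0.5) * dx   # midpoints of the n subintervals
f = 1.0 / (b - a)                   # uniform pdf on (a, b)
m1 = np.sum(x * f * dx)             # E[X]
m2 = np.sum(x**2 * f * dx)          # E[X^2]
print(m1, m2 - m1**2)               # mean and variance
```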
12. Let M_X(t) be the moment generating function of X. Show that (a) M_{aX+b}(t) = e^{bt} M_X(at). (b) If X ~ Normal(μ, σ²) and the moment generating function of X is M_X(t) = e^{tμ + t²σ²/2}, show that the standardized random variable Z = (X − μ)/σ ~ Normal(0, 1). 13. If X₁, X₂, ..., Xₙ are mutually independent normal random variables with means μ₁, μ₂, ..., μₙ and variances σ₁², σ₂², ..., σₙ², then prove that ΣXᵢ ~ Normal(Σμᵢ, Σσᵢ²). 14. If M_X(t) is the moment generating function of X, show that (a) log(M_X...
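Identity (a) of problem 12 is easy to probe numerically; the sketch below uses a hypothetical X ~ N(0, 1), whose MGF e^{t²/2} is known, with arbitrary test values of a, b, t.

```python
import numpy as np

# Empirical check of M_{aX+b}(t) = e^{bt} * M_X(at) for X ~ N(0, 1),
# where M_X(t) = exp(t^2 / 2). a, b, t are arbitrary test values.
rng = np.random.default_rng(5)
n = 400_000
a, b, t = 2.0, 0.5, 0.2
X = rng.normal(size=n)
lhs = np.exp(t * (a * X + b)).mean()          # empirical M_{aX+b}(t)
rhs = np.exp(b * t) * np.exp((a * t)**2 / 2)  # e^{bt} * M_X(at)
print(lhs, rhs)
```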
Suppose we have 5 independent and identically distributed random variables X₁, X₂, X₃, X₄, X₅, each with the moment generating function M_X(t) = … Let the random variable Y be defined as Y = ΣXᵢ. The density function of Y is: (a) Poisson with λ = 40; (b) Gamma with α = 10 and λ = 8; (c) Normal with μ = 40 and σ = 3.162; (d) Exponential with λ = 50; (e) Normal with μ = 50 and σ² = 15.
X and Y are random variables. (a) Show that E(X) = E(E(X|Y)). (b) If P({X ≤ x, Y ≤ y}) = P({X ≤ x})P({Y ≤ y}), then show that E(XY) = E(X)E(Y), i.e., if two random variables are independent, then show that they are uncorrelated. Is the reverse true? Prove or disprove. (c) The moment generating function of a random variable Z is defined as Ψ_Z(t) := E(e^{tZ}). Now if X and Y are independent random variables, then show that ... Also, if Ψ_X(t) = (λ − ... (d) Show the conditional...
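For the "is the reverse true?" part of (b), a standard counterexample is X ~ N(0, 1) with Y = X²: the pair is uncorrelated, yet Y is a deterministic function of X. A quick empirical illustration:

```python
import numpy as np

# X ~ N(0, 1) and Y = X^2 are uncorrelated (Cov(X, Y) = E[X^3] = 0) but
# clearly dependent, so uncorrelated does NOT imply independent.
rng = np.random.default_rng(6)
X = rng.normal(size=1_000_000)
Y = X**2
cov = np.mean(X * Y) - X.mean() * Y.mean()
print(cov)  # near 0, even though Y is a function of X
```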