Question 3 (4101) Suppose that X, Y, and Z are all independent of each other, with the following distributions: X ∼ Poisson(λ), Y ∼ Gamma(a, b), Z ∼ N(0,1). Define A as the sum A = X + Y + Z. (a) What is E[A]? (b) What is the MGF of A? (You don't need to re-derive the individual MGFs.) (c) Use M_A(t) to find E[A] (it should match part a).
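A minimal simulation sketch of part (a), assuming Gamma(a, b) is parameterized by shape a and scale b (so E[Y] = ab); the numeric values of λ, a, and b below are placeholders, not taken from the problem:

```python
import numpy as np

# Sketch: check E[A] = E[X] + E[Y] + E[Z] by simulation.
# Assumed (not from the problem): lam, a, b are placeholder values, and
# Gamma(a, b) is parameterized by shape a and scale b, so E[Y] = a*b.
rng = np.random.default_rng(0)
lam, a, b, n = 1.5, 2.0, 3.0, 1_000_000

X = rng.poisson(lam, n)
Y = rng.gamma(shape=a, scale=b, size=n)
Z = rng.normal(0.0, 1.0, n)
A = X + Y + Z

print("simulated E[A]  :", A.mean())           # ~ 7.5 for these placeholder values
print("theoretical E[A]:", lam + a * b + 0.0)  # E[X] + E[Y] + E[Z]
```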
The moment generating function (MGF) for a random variable X is M_X(t) = E[e^{tX}]. One useful property of moment generating functions is that they make it relatively easy to compute weighted sums of independent random variables: if Z = αX + βY, then M_Z(t) = M_X(αt) M_Y(βt). (A) Derive the MGF for a Poisson random variable X with parameter λ. (B) Let X be a Poisson random variable with parameter λ, as above, and let Y be a Poisson random variable with parameter γ. ...
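A quick numerical check of the Poisson MGF M_X(t) = exp(λ(e^t − 1)) and of the weighted-sum property stated above; λ, μ, α, β, and t are placeholder values, none of them from the problem:

```python
import numpy as np

# Sketch: empirically check the Poisson MGF M_X(t) = E[e^{tX}] = exp(lam*(e^t - 1)).
# lam, mu, alpha, beta, and t below are placeholder values, not from the problem.
rng = np.random.default_rng(1)
lam, mu, t, n = 2.0, 1.5, 0.3, 1_000_000

X = rng.poisson(lam, n)
empirical = np.exp(t * X).mean()
closed_form = np.exp(lam * (np.exp(t) - 1.0))
print(empirical, closed_form)  # should agree to a couple of decimal places

# Weighted-sum property: for independent X, Y and Z = alpha*X + beta*Y,
# M_Z(t) = M_X(alpha*t) * M_Y(beta*t).
alpha, beta = 2.0, 3.0
Y = rng.poisson(mu, n)
Z = alpha * X + beta * Y
lhs = np.exp(t * Z).mean()
rhs = np.exp(lam * (np.exp(alpha * t) - 1.0)) * np.exp(mu * (np.exp(beta * t) - 1.0))
print(lhs, rhs)
```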
1. Suppose X ∼ Gamma(a,b) and Y ∼ Gamma(c,d). Furthermore, suppose X and Y are independent. Let W = X + Y. (a) Find the MGF of W. (b) What restrictions would need to be placed on the values of a, b, c, and d in order for W to be a Gamma random variable? What would the parameters be?
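A sketch of the idea behind part (b), assuming the rate parameterization Gamma(shape, rate): the MGFs multiply to (1 − t/b)^(−a) (1 − t/d)^(−c), which is again a gamma MGF only when the rates match (b = d), giving W ∼ Gamma(a + c, b). The simulation below uses placeholder parameter values and a Kolmogorov–Smirnov test as a sanity check:

```python
import numpy as np
from scipy import stats

# Sketch: with Gamma(shape, rate) parameterization, X ~ Gamma(a, b) and Y ~ Gamma(c, d)
# independent give W = X + Y ~ Gamma(a + c, b) only when the rates match (b = d).
# a, c, b below are placeholder values, not from the problem.
rng = np.random.default_rng(2)
a, c, b, n = 2.0, 3.5, 1.5, 500_000

X = rng.gamma(shape=a, scale=1.0 / b, size=n)   # numpy uses scale = 1/rate
Y = rng.gamma(shape=c, scale=1.0 / b, size=n)
W = X + Y

# Kolmogorov-Smirnov check of W against Gamma(a + c, rate b)
print(stats.kstest(W, stats.gamma(a + c, scale=1.0 / b).cdf))
```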
Problem D: Suppose X_1, ..., X_4 are independent random variables. Let Y be their sum, that is, Y = X_1 + X_2 + X_3 + X_4. Find/prove the MGF of Y and find E(Y), Var(Y), and P(8 ≤ Y ≤ 9) if (a) X_1, ..., X_4 are Poisson random variables with means 5, 1, 4, and 2, respectively; (b) [separately from part a)] X_1, ..., X_4 are Geometric random variables with p = 3/4.
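A sketch of part (a), assuming the garbled probability was P(8 ≤ Y ≤ 9): the sum of independent Poissons is Poisson with the summed mean, so Y ∼ Poisson(12), E(Y) = Var(Y) = 12, and the probability is just two pmf terms:

```python
from scipy import stats

# Sketch for part (a): the sum of independent Poissons is Poisson with the summed mean,
# so Y ~ Poisson(5 + 1 + 4 + 2) = Poisson(12), and E(Y) = Var(Y) = 12.
# The event below assumes the garbled probability was P(8 <= Y <= 9).
mu = 5 + 1 + 4 + 2
Y = stats.poisson(mu)

print("E(Y) =", Y.mean(), " Var(Y) =", Y.var())
print("P(8 <= Y <= 9) =", Y.pmf(8) + Y.pmf(9))
```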
MULTIVARIATE DISTRIBUTIONS 3. Suppose that X_1 and X_2 are independent and each has a uniform distribution on (0,1). Define Y_1 = X_1 + X_2 and Y_2 = X_1 − X_2. Find the marginal probability density functions of Y_1 and Y_2. 4. Suppose that X has a standard normal distribution, and that the conditional distribution of Y given X is a normal distribution with mean 2X + 3 and variance 12. Find E(Y) and Var(Y).
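A sketch of problem 4, assuming the reconstructed conditional mean 2X + 3: by the laws of total expectation and total variance, E(Y) = E(2X + 3) = 3 and Var(Y) = Var(2X + 3) + E(12) = 4 + 12 = 16. The simulation below checks those values:

```python
import numpy as np

# Sketch: if Y | X ~ N(2X + 3, 12) and X ~ N(0,1), then
# E(Y) = E(2X + 3) = 3 and Var(Y) = Var(2X + 3) + E(12) = 4 + 12 = 16.
# The conditional mean "2X + 3" is the reconstruction used above.
rng = np.random.default_rng(3)
n = 1_000_000
X = rng.normal(0.0, 1.0, n)
Y = rng.normal(2 * X + 3, np.sqrt(12.0))

print("E(Y)   ~", Y.mean())  # ~ 3
print("Var(Y) ~", Y.var())   # ~ 16
```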
5. Suppose that X and Y are independent with distributions N(0, σ_1^2) and N(0, σ_2^2), respectively. Let Z = X + Y. Also, let W = σ_2^2 X − σ_1^2 Y. Prove that Z and W are uncorrelated.
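The covariance computation behind problem 5 is one line: Cov(Z, W) = σ_2^2 Var(X) − σ_1^2 Var(Y) = σ_2^2 σ_1^2 − σ_1^2 σ_2^2 = 0. A simulation sketch with placeholder standard deviations:

```python
import numpy as np

# Sketch: Cov(Z, W) = Cov(X + Y, s2^2*X - s1^2*Y) = s2^2*Var(X) - s1^2*Var(Y)
#                   = s2^2*s1^2 - s1^2*s2^2 = 0, so Z and W are uncorrelated.
# s1 and s2 below are placeholder standard deviations, not from the problem.
rng = np.random.default_rng(4)
s1, s2, n = 1.0, 2.0, 1_000_000

X = rng.normal(0.0, s1, n)
Y = rng.normal(0.0, s2, n)
Z = X + Y
W = s2**2 * X - s1**2 * Y

print("sample Cov(Z, W):", np.cov(Z, W)[0, 1])  # ~ 0
```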
Suppose X ∼ gamma(α, γ) and Y ∼ gamma(β, γ) are independent. Prove the following: (a) U = X + Y ∼ gamma(α + β, γ); (b) V = X/(X + Y) ∼ beta(α, β); (c) U and V are independent; (d) W^2 ∼ gamma(1/2, 1/2) when W ∼ N(0,1).
(1 point) In Unit 3, I claimed that the sum of independent, identically distributed exponential random variables is a gamma random variable. Now that we know about moment generating functions, we can prove it. Let X be exponential with rate λ = 4, so the density is f(x) = 4e^{−4x} for x > 0. (a) Find the moment generating function of X, and evaluate it at t = 3.9. The MGF of a gamma is more tedious to find, so I'll give it to you here. Let W ∼ Gamma(n, λ)...
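Under the reconstruction above (rate 4, density 4e^{−4x}), the MGF is M_X(t) = 4/(4 − t) for t < 4, so M_X(3.9) = 40. A sketch that checks the formula by simulation (at a smaller t, where the Monte Carlo average is well behaved) and prints the closed form at t = 3.9:

```python
import numpy as np

# Sketch: with the density reconstructed as f(x) = 4*exp(-4x) (rate 4, mean 1/4),
# the MGF is M_X(t) = 4 / (4 - t) for t < 4, so M_X(3.9) = 4 / 0.1 = 40.
rng = np.random.default_rng(5)
n = 1_000_000
X = rng.exponential(scale=0.25, size=n)   # numpy's scale is the mean, i.e. 1/rate

t_check = 1.0                             # check at a t where the Monte Carlo error is small
print(np.exp(t_check * X).mean(), 4 / (4 - t_check))
print("M_X(3.9) =", 4 / (4 - 3.9))        # closed form at t = 3.9
```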
1. Let X and Y be two independent random variables following beta distributions Beta(120, 2019). (a) What's P(X 0.3)? (b) What's E(2X − Y)? (c) What's P(2X + 4 > 3Y)? (d) What's P(X < Y)? (e) Now suppose X and Y are no longer independent of each other. Will the answers to (a)–(d) remain the same? Explain. (f) Now define Z ∼ Beta(2019, 120). Compare the medians of X and Z: which one is bigger? Compare the variances of X and Z, ...
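A sketch for parts (b) and (f) using SciPy's beta distribution: since X and Y are identically distributed, E(2X − Y) = 2E(X) − E(Y) = E(X) = 120/2139, and Beta(120, 2019) versus Beta(2019, 120) have mirrored medians but identical variances:

```python
from scipy import stats

# Sketch for parts (b) and (f): X ~ Beta(120, 2019), Z ~ Beta(2019, 120).
# E(2X - Y) = 2*E(X) - E(Y) = E(X) since X and Y share the same distribution;
# Beta(a, b) and Beta(b, a) have mirrored medians but the same variance.
X = stats.beta(120, 2019)
Z = stats.beta(2019, 120)

print("E(2X - Y) =", 2 * X.mean() - X.mean())        # = 120 / (120 + 2019)
print("median X  =", X.median(), " median Z =", Z.median())
print("var X     =", X.var(),    " var Z    =", Z.var())
```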
7. Assume X ∼ gamma(α, γ) and Y ∼ gamma(β, γ) are independent. (a) Show that U = X + Y ∼ gamma(α + β, γ). (b) Show that V = X/(X + Y) ∼ beta(α, β). (c) Show that U and V are independent. (d) Show that W = Z^2 ∼ gamma(1/2, 1/2) if Z ∼ N(0,1).
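A simulation sketch of parts (b) and (d), assuming the gamma(shape, rate) parameterization: Z^2 should match gamma(1/2, 1/2) (shape 1/2, scale 2, i.e. the chi-square distribution with one degree of freedom), and X/(X + Y) should match Beta(α, β). The numeric parameter values below are placeholders:

```python
import numpy as np
from scipy import stats

# Sketch for part (d): if Z ~ N(0,1) then W = Z^2 ~ gamma(1/2, 1/2)
# (shape 1/2, rate 1/2, i.e. scale 2 -- the chi-square distribution with 1 df).
rng = np.random.default_rng(6)
Z = rng.normal(0.0, 1.0, 500_000)
W = Z**2
print(stats.kstest(W, stats.gamma(0.5, scale=2.0).cdf))

# Quick look at part (b): with X ~ gamma(alpha, rate g) and Y ~ gamma(beta, rate g),
# V = X/(X+Y) should behave like Beta(alpha, beta). alpha, beta, g are placeholders.
alpha, beta, g = 2.0, 5.0, 1.3
X = rng.gamma(alpha, scale=1.0 / g, size=500_000)
Y = rng.gamma(beta, scale=1.0 / g, size=500_000)
print(stats.kstest(X / (X + Y), stats.beta(alpha, beta).cdf))
```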