1. Suppose that X ∼ Poisson(2) and Y ∼ Poisson(3) are independent random variables....
8. Let the random variable X be the sum of n independent Poisson-distributed random variables, i.e., X = Σ_{i=1}^n X_i, where X_i is Poisson distributed with mean λ_i. (a) Find the moment generating function of X_i. (b) Derive the moment generating function of X. (c) Hence, find the probability mass function of X.
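The MGF argument in problem 8 shows that a sum of independent Poisson(λ_i) variables is Poisson(Σλ_i). A minimal simulation sketch of that conclusion (the means 1, 2, 3 and the sample size are arbitrary choices, not from the problem):

```python
import math
import random

random.seed(0)

# Arbitrary example means; the claim is X = X1 + X2 + X3 ~ Poisson(1 + 2 + 3).
means = [1.0, 2.0, 3.0]
lam = sum(means)

def poisson_sample(mu):
    # Knuth's method: multiply uniforms until the product drops below e^{-mu}.
    L, k, prod = math.exp(-mu), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

n = 100_000
samples = [sum(poisson_sample(m) for m in means) for _ in range(n)]

# Compare the empirical pmf with the Poisson(6) pmf at a few points.
for k in range(4, 9):
    empirical = samples.count(k) / n
    exact = math.exp(-lam) * lam**k / math.factorial(k)
    print(f"P(X={k}): empirical {empirical:.4f}, Poisson({lam:g}) {exact:.4f}")
```

The empirical frequencies should track the Poisson(6) pmf to two or three decimal places at this sample size.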
Problem D: Suppose X1, ..., X4 are independent random variables. Let Y be their sum, that is Y = Σ_{i=1}^4 X_i. Find/prove the mgf of Y and find E(Y), Var(Y), and P(8 ≤ Y ≤ 9) if a) X1, ..., X4 are Poisson random variables with means 5, 1, 4, and 2, respectively. b) [separately from part a)] X1, ..., X4 are Geometric random variables with p = 3/4.
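For part a) of Problem D, the MGF of the sum gives Y ~ Poisson(5 + 1 + 4 + 2) = Poisson(12), so E(Y) = Var(Y) = 12 and the probability is a two-term pmf sum. A short sketch of that arithmetic (reading the garbled "P (8 Y 9)" as P(8 ≤ Y ≤ 9), which is an assumption):

```python
import math

lam = 5 + 1 + 4 + 2           # Y ~ Poisson(12) by the MGF argument
mean, var = lam, lam          # E(Y) = Var(Y) = lambda for a Poisson

def poisson_pmf(k, mu):
    return math.exp(-mu) * mu**k / math.factorial(k)

p = poisson_pmf(8, lam) + poisson_pmf(9, lam)   # P(8 <= Y <= 9)
print(mean, var, round(p, 4))                    # → 12 12 0.1529
```

Part b) is different: a sum of four iid Geometric(3/4) variables is Negative Binomial, not Geometric, so its pmf has to be read off the product MGF separately.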
Problem 4 Let X and Y be independent Poisson(λ1) and Poisson(λ2) random variables, respectively. i. Write an expression for the PMF of Z = X + Y, i.e., p_Z[n] for all possible n. ii. Write an expression for the conditional PMF of X given that Z = n, i.e., p_{X|Z}[k|n] for all possible k. Which random variable has the same PMF, i.e., is this PMF that of a Bernoulli, binomial, Poisson, geometric, or uniform random variable (which assumes all possible values with equal...
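The standard answers here are Z ~ Poisson(λ1 + λ2) and X | Z = n ~ Binomial(n, λ1/(λ1 + λ2)). An exact numerical check of the conditional pmf, with λ1 = 2, λ2 = 3, and n = 6 chosen purely for illustration:

```python
import math

lam1, lam2 = 2.0, 3.0      # arbitrary example rates
n = 6                      # condition on Z = X + Y = 6

def pois(k, mu):
    return math.exp(-mu) * mu**k / math.factorial(k)

# Direct Bayes computation: P(X=k | Z=n) = P(X=k) P(Y=n-k) / P(Z=n),
# where Z ~ Poisson(lam1 + lam2) by part i.
for k in range(n + 1):
    direct = pois(k, lam1) * pois(n - k, lam2) / pois(n, lam1 + lam2)
    binom = math.comb(n, k) * (lam1/(lam1+lam2))**k * (lam2/(lam1+lam2))**(n-k)
    assert abs(direct - binom) < 1e-12
print("conditional pmf matches Binomial(n, lam1/(lam1+lam2))")
```

The algebra behind the check: the e^{-λ} factors cancel and the n!/(k!(n−k)!) term emerges, leaving exactly the binomial pmf with success probability λ1/(λ1 + λ2).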
Problem 1 (16 points). Suppose that X and Y are independent random variables and that Y follows a geometric distribution with parameter p. Assume that X takes only nonnegative integer values, and let Gx(z) be the probability generating function of X. (We make no additional assumptions about the distribution of X.) Show that P(X<Y) = Gx(1- p). Clearly indicate the step(s) in your argument that use the assumption that X and Y are independent.
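A simulation sketch of the identity in Problem 1, taking X ~ Poisson(2) purely for illustration (the result holds for any nonnegative-integer-valued X). Note it relies on the convention Y ∈ {1, 2, ...} with P(Y = k) = (1−p)^{k−1} p, so that P(Y > x) = (1−p)^x and P(X < Y) = E[(1−p)^X] = G_X(1−p):

```python
import math
import random

random.seed(1)
p = 0.4
lam = 2.0    # X ~ Poisson(2), an arbitrary choice; G_X(z) = exp(lam*(z - 1))

def poisson_sample(mu):
    # Knuth's method for a Poisson draw.
    L, k, prod = math.exp(-mu), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

def geometric_sample(p):
    # Number of Bernoulli(p) trials up to and including the first success,
    # so Y takes values 1, 2, 3, ...
    y = 1
    while random.random() >= p:
        y += 1
    return y

n = 200_000
hits = sum(poisson_sample(lam) < geometric_sample(p) for _ in range(n))
print(f"simulated P(X < Y) = {hits/n:.4f}, G_X(1-p) = {math.exp(-lam*p):.4f}")
```

With these parameters G_X(1−p) = e^{−0.8} ≈ 0.4493, and the simulated frequency should land within a few thousandths of it.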
Let X, Y be independent random variables where X is binomial(n = 4, p = 1/3) and Y is binomial(n = 3, p = 1/3). Find the moment-generating functions of the three random variables X, Y and X + Y. (You may look up the first two. The third follows from the first two and the behavior of moment-generating functions.) Now use the moment-generating function of X + Y to find the distribution of X + Y.
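The intended route is via MGFs: M_X(t)M_Y(t) = (2/3 + e^t/3)^4 · (2/3 + e^t/3)^3 = (2/3 + e^t/3)^7, the MGF of Binomial(7, 1/3). A sketch that cross-checks this conclusion by convolving the two pmfs directly (a different method than the problem asks for):

```python
import math

def binom_pmf(n, p, k):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

p = 1/3
px = [binom_pmf(4, p, k) for k in range(5)]   # X ~ Binomial(4, 1/3)
py = [binom_pmf(3, p, k) for k in range(4)]   # Y ~ Binomial(3, 1/3)

# pmf of X + Y by discrete convolution over all ways to split the total s
pz = [sum(px[i] * py[s - i] for i in range(len(px)) if 0 <= s - i < len(py))
      for s in range(8)]

for s, prob in enumerate(pz):
    assert abs(prob - binom_pmf(7, p, s)) < 1e-12
print("X + Y ~ Binomial(7, 1/3)")
```

The match holds because both variables share the same p; with unequal success probabilities the product of MGFs would not be a binomial MGF.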
1) Suppose that three random variables, X, Y, and Z, have a continuous joint probability density function f(x, y, z) … elsewhere. a) Determine the value of the constant. b) Find the marginal joint p.d.f. of X and Y, namely f(x, y). (3 Points) c) Using part b), compute the conditional probability density of Z given X and Y; that is, find f(z | x, y). d) Using the result from part c), compute P(Z < 0.5 | x …). (3 Points) 2...
Problem 4. Let X and Y be independent Geom(p) random variables. Let V = min(X, Y) and W = …. Find the joint mass function of (V, W) and show that V and W are independent.
Let X, Y and Z be three independent Poisson random variables with parameters λ1, λ2, and λ3, respectively. For y = 0, 1, 2, …, t, calculate P(Y = y | X + Y + Z = t). (Hint: Determine first the probability distribution of T = X + Y + Z using the moment generating function method. The moment generating function for a Poisson random variable is given in earlier lecture notes.)
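Following the hint gives T ~ Poisson(λ1 + λ2 + λ3), and the conditional law works out to Y | T = t ~ Binomial(t, λ2/(λ1 + λ2 + λ3)). A rejection-sampling sketch of the conditional mean, with λ = (1, 2, 3) and t = 5 chosen purely as an example:

```python
import math
import random

random.seed(2)
lams = (1.0, 2.0, 3.0)   # arbitrary example values for lambda1, lambda2, lambda3
t_val = 5                # condition on T = X + Y + Z = 5

def poisson_sample(mu):
    # Knuth's method for a Poisson draw.
    L, k, prod = math.exp(-mu), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

# Rejection sampling: keep Y only on runs where X + Y + Z hits t_val.
ys = []
for _ in range(100_000):
    x, y, z = (poisson_sample(l) for l in lams)
    if x + y + z == t_val:
        ys.append(y)

# Theory: Y | T = t is Binomial(t, lambda2 / (lambda1 + lambda2 + lambda3)),
# so the conditional mean should be near t * lambda2 / sum(lams) = 5 * 2/6.
print(f"simulated E[Y | T=5] = {sum(ys)/len(ys):.3f}, "
      f"theory = {t_val * lams[1] / sum(lams):.3f}")
```

Since P(T = 5) ≈ 0.16 here, roughly 16,000 of the 100,000 runs survive the conditioning, which is plenty for the mean to settle near 5/3.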
(a) Suppose X ∼ Poisson(λ) and Y ∼ Poisson(γ) are independent; prove that X + Y ∼ Poisson(λ + γ). (b) Let X1, . . . , Xn be an iid random sample from Poisson(λ); provide a sufficient statistic for λ and justify your answer. (c) Under the setting of part (b), show that λ̂ = (1/n) Σ_{i=1}^n X_i is a consistent estimator of λ. (d) Use the Central Limit Theorem to find an asymptotic normal distribution for λ̂ defined in part (c), justify...
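For parts (c) and (d): the CLT gives λ̂ approximately N(λ, λ/n), since Var(X_i) = λ for a Poisson. A simulation sketch of that approximation (λ = 4, n = 400, and the replicate count are arbitrary illustrative choices):

```python
import math
import random

random.seed(3)
lam, n, reps = 4.0, 400, 2000   # arbitrary example values

def poisson_sample(mu):
    # Knuth's method for a Poisson draw.
    L, k, prod = math.exp(-mu), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

# Each replicate computes lambda-hat = sample mean of n Poisson(lam) draws.
hats = [sum(poisson_sample(lam) for _ in range(n)) / n for _ in range(reps)]

mean_hat = sum(hats) / reps
var_hat = sum((h - mean_hat)**2 for h in hats) / (reps - 1)
# CLT: lambda-hat ~ approx N(lam, lam/n), so the variance should be near 0.01.
print(f"mean {mean_hat:.3f} (lam = {lam}), variance {var_hat:.5f} (lam/n = {lam/n})")
```

The shrinking variance λ/n is also the substance of part (c): λ̂ concentrates around λ as n grows, which is consistency.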