(a) Suppose X ∼ Poisson(λ) and Y ∼ Poisson(γ) are independent. Prove that X + Y ∼ Poisson(λ + γ).
(b) Let X1, . . . , Xn be an iid random sample from Poisson(λ). Provide a sufficient statistic for λ and justify your answer.
(c) Under the setting of part (b), show that λ̂ = (1/n) Σ_{i=1}^n Xi is a consistent estimator of λ.
(d) Use the Central Limit Theorem to find an asymptotic normal distribution for λ̂ defined in part (c); justify your answer.
(e) Suppose γ is a random variable with an Exp(θ) distribution. Conditional on γ, Y ∼ Poisson(γ). Provide the marginal mean and variance of Y.
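A quick simulation sketch (not part of the problem) to sanity-check parts (a) and (e). The values of λ, γ, θ are illustrative, and Exp(θ) is assumed to have mean 1/θ; by the laws of total mean and variance, E[Y] = 1/θ and Var(Y) = 1/θ + 1/θ².

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
lam, gam = 2.0, 3.0

# (a) X + Y should behave like Poisson(lam + gam): mean and variance both lam + gam.
s = rng.poisson(lam, n) + rng.poisson(gam, n)
print(s.mean(), s.var())  # both ≈ 5.0

# (e) Assuming Exp(theta) has mean 1/theta: E[Y] = E[γ] = 1/theta and, by the
# law of total variance, Var(Y) = E[γ] + Var(γ) = 1/theta + 1/theta**2.
theta = 0.5
g = rng.exponential(1 / theta, n)  # numpy parametrizes exponential by its mean
y = rng.poisson(g)
print(y.mean(), y.var())  # ≈ 2.0 and ≈ 6.0
```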
Suppose X1, . . . , Xn are iid Poisson(λ) random variables. Show by direct calculation, without using any theorem in mathematical statistics, that (a) X̄ = (1/n) Σ_{i=1}^n Xi is an unbiased estimator of λ; (b) X̄ is optimal in MSE among all unbiased estimators. That is, if T is any other unbiased estimator, then E_λ[(X̄ − λ)²] ≤ E_λ[(T − λ)²].
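A numerical sketch of part (b)'s claim: compare the MSE of X̄ with that of another unbiased estimator, here T = X₁ (a single observation, also unbiased for λ). λ = 3 and n = 10 are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n, reps = 3.0, 10, 100_000
x = rng.poisson(lam, size=(reps, n))

xbar = x.mean(axis=1)  # MSE = Var(X̄) = λ/n
t = x[:, 0]            # MSE = Var(X₁) = λ

mse_xbar = ((xbar - lam) ** 2).mean()
mse_t = ((t - lam) ** 2).mean()
print(mse_xbar, mse_t)  # ≈ 0.3 and ≈ 3.0
```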
5. Let X ∼ Exp(λ) with λ unknown, and suppose X1, X2 is a random sample of size 2. Show that M = √(X1 · X2) is a biased estimator of 1/λ and modify it to create an unbiased estimator. (Hint: During your journey, you’ll need the help of the gamma distribution, the gamma function, and the knowledge that Γ(1/2) = √π.)
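A simulation sketch of the bias in question, assuming the rate parametrization (E[X] = 1/λ): since E[√X] = Γ(3/2)/√λ, we get E[M] = π/(4λ), so (4/π)·M is the unbiased modification. λ = 2 is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, reps = 2.0, 500_000
x1 = rng.exponential(1 / lam, reps)  # numpy uses the mean 1/λ as its scale
x2 = rng.exponential(1 / lam, reps)
m = np.sqrt(x1 * x2)

print(m.mean())                # ≈ π/(4λ) ≈ 0.3927, not 1/λ = 0.5: biased
print((4 / np.pi) * m.mean())  # ≈ 1/λ = 0.5: bias corrected
```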
1.(c), 2.(a),(b).
5. Let X1, . . . , Xn be iid N(θ, 1). (a) Show that X̄ is a complete sufficient statistic. (b) Show that the UMVUE of θ² is X̄² − 1/n.
6. Let X1, . . . , Xn be i.i.d. Gamma(α, θ), with pdf f(x) = x^{α−1} e^{−x/θ} / (Γ(α) θ^α), x > 0, θ > 0, where α > 1 is known. (a) Show that Σ Xi is complete and sufficient for θ. (b) Find E[1/X]. (c) Find the UMVUE of 1/θ.
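A quick unbiasedness check for 5(b): with Xi ∼ N(θ, 1), E[X̄²] = θ² + 1/n, so X̄² − 1/n hits θ² on average. θ = 1.7 and n = 20 are illustrative values.

```python
import numpy as np

rng = np.random.default_rng(10)
theta, n, reps = 1.7, 20, 400_000
xbar = rng.normal(theta, 1, size=(reps, n)).mean(axis=1)
est = xbar ** 2 - 1 / n  # candidate UMVUE of θ²
print(est.mean(), theta ** 2)  # both ≈ 2.89
```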
Let X1, ... , Xn be iid with the Poisson(λ) distribution. What is the conditional distribution of Xi given the sample mean?
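A simulation sketch of the answer being probed here: conditional on Σ Xi = t, each Xi is Binomial(t, 1/n). The check below compares the empirical conditional pmf at one point (k = 2) against the binomial formula; λ, n, t, k are illustrative.

```python
import math
import numpy as np

rng = np.random.default_rng(3)
lam, n, reps, t = 2.0, 5, 400_000, 10
x = rng.poisson(lam, size=(reps, n))
s = x.sum(axis=1)
x1_given_t = x[s == t, 0]  # draws of X1 conditional on the sum equalling t

emp = np.mean(x1_given_t == 2)
theo = math.comb(t, 2) * (1 / n) ** 2 * (1 - 1 / n) ** (t - 2)  # Binomial(t, 1/n) pmf at 2
print(emp, theo)  # both ≈ 0.302
```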
Part 1: Derive the expected value and find the asymptotic distribution. Part 2: Find the consistent estimator and use the central limit theorem. b. Derive the expected value of X for the Weibull(λ, 2) distribution. c. Suppose X1, . . . , Xn ∼ iid Unif(0, θ). Find the asymptotic distribution of Z = n(θ − X_max), where X_max = max_i Xi. Let X1, . . . , Xn ∼ i.i.d. Exp(β). a. Find a consistent estimator for the second moment E(X²). Use the mgf of X to prove that your estimator is consistent in the case β = 2. b. Use the...
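A sketch for the Unif(0, θ) part: Z = n(θ − X_max) converges in distribution to an exponential with mean θ, since P(Z > z) = (1 − z/(nθ))^n → e^{−z/θ}. θ = 2 is illustrative; the max of n uniforms is drawn exactly via the identity X_max ≡ θ·U^{1/n}.

```python
import numpy as np

rng = np.random.default_rng(8)
theta, n, reps = 2.0, 5_000, 200_000
# X_max / θ has cdf u^n on [0,1], so X_max equals θ * U**(1/n) in distribution.
xmax = theta * rng.random(reps) ** (1 / n)
z = n * (theta - xmax)
print(z.mean(), z.var())  # ≈ θ = 2 and ≈ θ² = 4, matching Exp(mean θ)
```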
Let X1, . . . , Xn be i.i.d. with pdf f(x) = (θ ν^θ / x^{θ+1}) I(x ≥ ν), where I(·) denotes the indicator function. (a) Find a 2-dimensional sufficient statistic for (θ, ν). (b) Suppose θ is a known constant. Find the MLE for ν. (d) Suppose ν = 1. Find the MLE for θ and determine its asymptotic distribution. Carefully justify your answer and state any theorems that you use. (e) Suppose ν = 1. Find the asymptotic distribution of the MLE estimator of exp[−...
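A sketch for part (d), assuming the Pareto reading of the garbled pdf (f(x) = θν^θ/x^{θ+1}, x ≥ ν): with ν = 1, the MLE is θ̂ = n / Σ log Xi, and √n(θ̂ − θ) → N(0, θ²) since the Fisher information is 1/θ². The simulation uses the fact that log X ∼ Exp(rate θ), so Σ log Xi ∼ Gamma(n, scale 1/θ); θ = 3 is illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)
theta, n, reps = 3.0, 2_000, 50_000
# Sum of n iid Exp(rate θ) variables, drawn directly as a Gamma(n, 1/θ).
log_sum = rng.gamma(n, 1 / theta, reps)
theta_hat = n / log_sum
z = np.sqrt(n) * (theta_hat - theta)
print(z.mean(), z.var())  # mean near 0, variance ≈ θ² = 9
```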
Can someone help me with part (c)? (With a detailed explanation.) Suppose that X1, . . . , Xn are independent and identically distributed Bernoulli random variables, with mass function P(Xi = 1) = p and P(Xi = 0) = 1 − p for some p ∈ (0, 1). (a) For each fixed p ∈ (0, 1), apply the central limit theorem to obtain the asymptotic distribution of Σ_{i=1}^n Xi, after appropriate centering and normalisation. (b) Derive the moment generating function of a Poisson(λ) distribution. (c) Now suppose that...
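A sketch of part (a): after centering by np and normalising by √(np(1 − p)), the sum should look standard normal. p = 0.3 and n = 2000 are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
p, n, reps = 0.3, 2_000, 100_000
s = rng.binomial(n, p, reps)  # Σ X_i for a Bernoulli(p) sample of size n
z = (s - n * p) / np.sqrt(n * p * (1 - p))
print(z.mean(), z.var())  # ≈ 0 and ≈ 1, as the CLT predicts
```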
Problem 3: Suppose X1, X2, . . . is a sequence of i.i.d. random variables having the Poisson distribution with mean λ. Let λ̂_n = X̄_n = (1/n) Σ_{i=1}^n Xi. (a) Is λ̂_n an unbiased estimator of λ? Explain your answer. (b) Is λ̂_n a consistent estimator of λ? Explain your answer.
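A numerical sketch of both parts: the sample mean stays centred at λ (unbiased) while its MSE = λ/n shrinks to 0 as n grows (consistent). λ = 4 and the sample sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
lam = 4.0
mses = {}
for n in (10, 100, 1000):
    xbar = rng.poisson(lam, size=(20_000, n)).mean(axis=1)
    mses[n] = ((xbar - lam) ** 2).mean()
    print(n, xbar.mean(), mses[n])  # mean ≈ 4 throughout; MSE ≈ λ/n shrinks
```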
Please let me know how to solve 7.6.5. Let X1, X2, . . . , Xn be a random sample from a Poisson distribution with parameter θ > 0. (a) Find the MVUE of P(X ≤ 1) = (1 + θ)e^{−θ}. Hint: Let u(x1) = 1 if x1 ≤ 1, zero elsewhere, and find E[u(X1) | Y = y], where Y = Σ_{i=1}^n Xi. (b) Express the MVUE as a function of the MLE of θ. (c) Determine the asymptotic distribution of the MLE of θ. (d) Obtain the MLE of P(X...
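A sketch of where the hint leads in (a): since X1 | Y = y is Binomial(y, 1/n), Rao–Blackwellization gives φ(Y) = (1 − 1/n)^Y + (Y/n)(1 − 1/n)^{Y−1}, which should average exactly to (1 + θ)e^{−θ}. θ = 2 and n = 15 are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
theta, n, reps = 2.0, 15, 400_000
y = rng.poisson(n * theta, reps)  # Y = Σ X_i ~ Poisson(nθ)
# P(B ≤ 1) for B ~ Binomial(y, 1/n); the maximum() guards the y = 0 exponent.
phi = (1 - 1 / n) ** y + (y / n) * (1 - 1 / n) ** np.maximum(y - 1, 0)
print(phi.mean(), (1 + theta) * np.exp(-theta))  # both ≈ 0.406
```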
2. Let X1, X2, . . . , Xn be i.i.d. Poisson with parameter λ. (a) Find the maximum likelihood estimator of λ. Is the estimator minimum variance unbiased? (b) Derive the asymptotic (large-sample) distribution of the maximum likelihood estimator. (c) Suppose we are interested in the probability of a zero: Q = P(Xi = 0) = exp(−λ). Find the maximum likelihood estimator of Q and its asymptotic distribution.
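A sketch of part (c): by invariance Q̂ = exp(−X̄), and the delta method gives √n(Q̂ − e^{−λ}) → N(0, λ e^{−2λ}). λ = 1.5 and n = 5000 are illustrative; the sum of the sample is drawn directly as one Poisson(nλ) variate.

```python
import numpy as np

rng = np.random.default_rng(6)
lam, n, reps = 1.5, 5_000, 100_000
xbar = rng.poisson(lam * n, reps) / n  # Σ X_i ~ Poisson(nλ), so X̄ = that sum / n
qhat = np.exp(-xbar)                   # MLE of Q = e^{−λ} by invariance
z = np.sqrt(n) * (qhat - np.exp(-lam))
print(z.mean(), z.var())  # mean near 0, variance ≈ λ e^{−2λ} ≈ 0.0747
```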