Let X1, . . . , Xn be a sample taken from the Gamma distribution Γ(2, θ⁻¹) with pdf f(x; θ) = θ²x exp(−θx) for x ≥ 0, θ ∈ (0, ∞), and 0 otherwise.
(A) Show that Y = ∑_{i=1}^n Xi is a complete and sufficient statistic.
(B) Find E(1/Y). Hint: If W ∼ χ²(k), then E(W^m) = 2^m Γ(k/2 + m)/Γ(k/2) for m > −k/2. Note also that Γ(n) = (n − 1)!, n ∈ ℕ*. Facts from 1(C) are useful:
Hint: some facts: 1. If Z ∼ N(0, 1), then Z² ∼ χ²(1). 2. If Yi ∼ Γ(α, β) and they are independent, then ∑_{i=1}^n Yi ∼ Γ(nα, β). 3. If Y ∼ Γ(α, β), then λY ∼ Γ(α, λβ) for λ > 0. 4. χ²(k) = Γ(k/2, 2).
(C) Find the minimum variance unbiased estimator and show that it is not efficient.
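Since Y = ∑Xi ∼ Γ(2n, θ⁻¹) (rate θ), the hint gives E(1/Y) = θ/(2n − 1). A quick Monte Carlo sketch can check this numerically; the values n = 5 and θ = 2 below are illustrative assumptions, not part of the exercise:

```python
import numpy as np

# Assumed illustration values (not from the exercise): n = 5, theta = 2.
rng = np.random.default_rng(0)
n, theta = 5, 2.0
reps = 200_000

# Each Xi ~ Gamma(shape=2, rate=theta); numpy uses scale = 1/rate.
# Then Y = sum Xi ~ Gamma(shape=2n, rate=theta).
samples = rng.gamma(shape=2, scale=1 / theta, size=(reps, n))
Y = samples.sum(axis=1)

mc_mean = np.mean(1 / Y)        # Monte Carlo estimate of E(1/Y)
exact = theta / (2 * n - 1)     # closed form: theta / (2n - 1)
print(mc_mean, exact)
```

The two printed values should agree to a few decimal places, which also suggests the MVUE candidate (2n − 1)/Y is unbiased for θ.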
Exercise: Let Y1, Y2, …, Yn be a random sample from a Gamma distribution with parameters α and β. Assume α > 0 is known. a. Find the maximum likelihood estimator for β. b. Show that the MLE is consistent for β. c. Find a sufficient statistic for β. d. Find a minimum variance unbiased estimator of β. e. Find a uniformly most powerful test for H0: β = 2 vs. HA: β > 2. (Assume P(Type I Error) = 0.05, n = 10, and α = ...
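Under the usual scale parameterization E(Y) = αβ, the MLE with α known works out to β̂ = Ȳ/α. A simulation sketch (assumed values α = 3, β = 2; not part of the exercise) illustrates the estimator:

```python
import numpy as np

# Assumed illustration values: alpha = 3 (known shape), beta = 2 (scale).
rng = np.random.default_rng(1)
alpha, beta, n = 3.0, 2.0, 50_000

y = rng.gamma(shape=alpha, scale=beta, size=n)
beta_mle = y.mean() / alpha   # MLE when alpha is known: beta_hat = Ybar / alpha
print(beta_mle)
```

With n this large, β̂ should land very close to the true β = 2, consistent with part b.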
Let X1, X2, ..., Xn be a random sample from a Gamma(α, θ) distribution. That is, f(x; α, θ) = (1/(Γ(α)θ^α)) x^{α−1} e^{−x/θ}, 0 < x < ∞, α > 0, θ > 0. Suppose α is known. a. Obtain a method of moments estimator of θ, θ̃. b. Obtain the maximum likelihood estimator of θ, θ̂. c. Is θ̂ an unbiased estimator of θ? Justify your answer. Hint: E(X̄) = μ. d. Find Var(θ̂). Hint: Var(X̄) = σ²/n. e. Find MSE(θ̂).
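Here the MoM and MLE coincide: θ̂ = X̄/α, which is unbiased with Var(θ̂) = θ²/(nα), so MSE(θ̂) = Var(θ̂). A Monte Carlo sketch with assumed values (α = 2, θ = 1.5, n = 40; illustrative only) checks both facts:

```python
import numpy as np

# Assumed illustration values: alpha = 2 (known), theta = 1.5 (scale), n = 40.
rng = np.random.default_rng(2)
alpha, theta, n, reps = 2.0, 1.5, 40, 100_000

x = rng.gamma(shape=alpha, scale=theta, size=(reps, n))
theta_hat = x.mean(axis=1) / alpha        # MoM = MLE = Xbar / alpha

emp_bias = theta_hat.mean() - theta       # should be ~0 (unbiased)
emp_var = theta_hat.var()
exact_var = theta**2 / (n * alpha)        # Var(theta_hat) = theta^2 / (n * alpha)
print(emp_bias, emp_var, exact_var)
```

The empirical bias should be near zero and the empirical variance near θ²/(nα).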
Let the random sample X1, . . . , Xn be taken from the Binomial distribution with parameter θ, which is unknown and must be estimated. Let the prior distribution of θ be the beta distribution with known parameters α > 0 and β > 0. Find the Bayes risk and the Bayes estimator of θ using squared error loss.
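The conjugate Beta-Binomial update gives the squared-error Bayes estimator as the posterior mean. A small numerical sketch, assuming Bernoulli(θ) observations and the illustrative values α = 2, β = 3 and a hypothetical 0/1 sample:

```python
# Assumed prior hyperparameters and a hypothetical 0/1 sample (not from the exercise).
alpha, beta = 2.0, 3.0
x = [1, 0, 1, 1, 0, 1]
n, s = len(x), sum(x)

# Posterior is Beta(alpha + s, beta + n - s); the squared-error Bayes
# estimator is the posterior mean:
bayes_est = (alpha + s) / (alpha + beta + n)
print(bayes_est)   # (2 + 4) / (2 + 3 + 6) = 6/11
```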
Let X1, ..., Xn be a random sample from a Gamma(α, β) distribution, α > 0, β > 0. Show that T = (∑_{i=1}^n Xi, ∏_{i=1}^n Xi) is complete and sufficient for (α, β).
2. Let X1, X2, ..., Xn be a random sample from a uniform distribution on the interval (θ − 1, θ + 1). • Find the method of moments estimator of θ. Is your estimator an unbiased estimator of θ? • Given the following n = 5 observations of X, give a point estimate of θ: 6.61 7.70 6.98 8.36 7.26
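The mean of a Uniform(θ − 1, θ + 1) random variable is θ, so the method of moments estimator is just the sample mean, and the point estimate follows directly from the five observations:

```python
# Method of moments for Uniform(theta - 1, theta + 1): theta_hat = sample mean.
data = [6.61, 7.70, 6.98, 8.36, 7.26]
theta_mom = sum(data) / len(data)
print(theta_mom)   # 7.382
```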
Let X1, . . . , Xn be independent Poisson(θ) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
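With a Gamma(α, β) prior (scale parameterization, prior mean αβ), the Poisson likelihood keeps the posterior in the Gamma family, and the squared-error Bayes estimator is the posterior mean. A sketch with illustrative hyperparameters and a hypothetical count sample:

```python
# Assumed hyperparameters (scale parameterization) and hypothetical Poisson counts.
alpha, beta = 2.0, 1.5
x = [3, 1, 4, 2, 2]
n, s = len(x), sum(x)

# Posterior: Gamma(alpha + s, scale = beta / (1 + n*beta));
# squared-error Bayes estimator = posterior mean.
bayes_est = (alpha + s) * beta / (1 + n * beta)
print(bayes_est)   # (2 + 12) * 1.5 / (1 + 5 * 1.5) = 21/8.5
```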
Let X1, . . . , Xn be independent Beta(θ, 1) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
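The Beta(θ, 1) likelihood is θⁿ exp((θ − 1)∑log xi), so a Gamma(α, β) prior (scale β) stays conjugate: the posterior is Gamma with shape α + n and rate 1/β − ∑log xi. A sketch with assumed hyperparameters and a hypothetical sample in (0, 1):

```python
import math

# Assumed hyperparameters (beta = scale) and hypothetical Beta(theta, 1) draws.
alpha, beta = 2.0, 1.0
x = [0.7, 0.5, 0.9, 0.6]
n = len(x)
t = sum(math.log(xi) for xi in x)   # sum of log x_i (negative)

# Posterior: Gamma(alpha + n, rate = 1/beta - t);
# squared-error Bayes estimator = posterior mean.
bayes_est = (alpha + n) / (1 / beta - t)
print(bayes_est)
```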
2. Let Y1, ..., Yn be a random sample from an Exponential distribution with density function f(y) = (1/θ)e^{−y/θ}, y > 0. Let Y(1) = min(Y1, ..., Yn). (a) Find the CDF of Y(1). (b) Find the PDF of Y(1). (c) Is θ̂ = Y(1) an unbiased estimator of θ? Show your work. (d) What modification can be made to θ̂ so it is unbiased? Explain.
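The minimum of n Exponential(θ) draws is Exponential with mean θ/n, so Y(1) is biased and n·Y(1) is the unbiased modification. A Monte Carlo sketch with assumed values θ = 3, n = 4 (illustrative only):

```python
import numpy as np

# Assumed illustration values: theta = 3 (mean), n = 4.
rng = np.random.default_rng(3)
theta, n, reps = 3.0, 4, 200_000

y = rng.exponential(scale=theta, size=(reps, n))
y1 = y.min(axis=1)

mean_y1 = y1.mean()          # ~ theta / n  (so Y(1) is biased for theta)
mean_ny1 = (n * y1).mean()   # ~ theta      (bias-corrected estimator n * Y(1))
print(mean_y1, mean_ny1)
```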
QUESTION 2 Let X1, ..., Xn be a random sample from a N(μ, σ²) distribution, and let S² = ∑(Xi − X̄)²/(n − 1) and S̃² = ((n − 1)/n)S² be two estimators of σ². Given: E(S²) = σ² and V(S²) = 2σ⁴/(n − 1). (a) Determine: (i) E(S̃²); (ii) V(S̃²); and (iii) MSE(S̃²). (b) Which of S² and S̃² has a larger mean square error? (c) Suppose that θ̂n is an estimator of θ based on a random sample of size n. Another equivalent definition of ...
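The bias-variance trade-off in parts (a)-(b) can be checked in closed form: the scaled estimator has bias −σ²/n but smaller variance, and its MSE works out to (2n − 1)σ⁴/n², which is below 2σ⁴/(n − 1). A sketch with assumed values σ = 1, n = 10:

```python
# Assumed illustration values: sigma = 1, n = 10.
sigma, n = 1.0, 10
sigma4 = sigma**4

mse_s2 = 2 * sigma4 / (n - 1)                        # S2 unbiased: MSE = Var
bias_st2 = -sigma**2 / n                             # E(St2) - sigma^2 for the scaled estimator
var_st2 = ((n - 1) / n) ** 2 * 2 * sigma4 / (n - 1)  # Var(((n-1)/n) S2)
mse_st2 = var_st2 + bias_st2**2                      # = (2n - 1) * sigma^4 / n^2
print(mse_s2, mse_st2)
```

For n = 10 this gives 2/9 ≈ 0.222 versus 0.19, so the biased estimator wins on MSE.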
Suppose X ∼ Gamma(α, γ) and Y ∼ Gamma(β, γ) are independent. Prove the following: a) U = X + Y ∼ Gamma(α + β, γ) b) V = X/(X + Y) ∼ Beta(α, β) c) U and V are independent d) W² ∼ Gamma(1/2, 1/2) when W ∼ N(0, 1)
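A Monte Carlo sketch can sanity-check the means implied by (a), (b), and (d): E(U) = (α + β)γ, E(V) = α/(α + β), and E(W²) = 1 since W² is χ²(1). The shape values α = 2, β = 3 and common scale γ = 1 below are assumed for illustration:

```python
import numpy as np

# Assumed illustration values: shapes alpha = 2, beta_sh = 3, common scale 1.
rng = np.random.default_rng(4)
alpha, beta_sh, scale, reps = 2.0, 3.0, 1.0, 200_000

x = rng.gamma(shape=alpha, scale=scale, size=reps)
y = rng.gamma(shape=beta_sh, scale=scale, size=reps)

u = x + y        # (a): Gamma(alpha + beta_sh, scale), mean (2 + 3) * 1 = 5
v = x / (x + y)  # (b): Beta(alpha, beta_sh), mean 2 / (2 + 3) = 0.4

w = rng.standard_normal(reps)
w2 = w**2        # (d): Gamma(1/2, rate 1/2) = chi^2(1), mean 1
print(u.mean(), v.mean(), w2.mean())
```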