6. Let X1, ..., Xn be i.i.d. N(μ, σ^2). (a) Find the sample analogue estimator of θ. (b) Find the ML estimator of θ.
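The target θ is not specified in the recovered text. As an illustrative sketch (not part of the original problem set), the simulation below checks the standard ML estimators for the N(μ, σ^2) model, μ̂ = X̄ and σ̂^2 = (1/n) Σ(Xi − X̄)^2; the seed and the values μ = 2, σ^2 = 4 are arbitrary choices for the demo.

```python
import numpy as np

# Illustrative check of the normal ML estimators; mu = 2, sigma^2 = 4,
# n = 200_000 are arbitrary demo values, not from the problem.
rng = np.random.default_rng(0)
mu, sigma2, n = 2.0, 4.0, 200_000
x = rng.normal(mu, np.sqrt(sigma2), size=n)

mu_hat = x.mean()                        # ML estimator of mu
sigma2_hat = ((x - mu_hat) ** 2).mean()  # ML estimator of sigma^2 (divides by n, not n-1)

print(mu_hat, sigma2_hat)
```

With a large sample both estimates land close to the true values, reflecting consistency of the MLE.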
7. Let X1, ..., Xn be i.i.d. with the density p(x, θ) = θ^x (1 − θ)^(1−x) I{x ∈ {0, 1}}. (a) Find the ML estimator of θ. (b) Is it unbiased? (c) Compute its MSE.
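The density above is the Bernoulli(θ) pmf, for which the MLE is the sample mean X̄. A Monte Carlo sketch (not part of the problem set; θ = 0.3 and n = 50 are illustrative assumptions) that checks unbiasedness and the MSE formula θ(1 − θ)/n:

```python
import numpy as np

# Monte Carlo check: for Bernoulli(theta), the MLE is X_bar, which is
# unbiased with MSE(X_bar) = theta*(1 - theta)/n. Demo values only.
rng = np.random.default_rng(1)
theta, n, reps = 0.3, 50, 200_000
x = rng.binomial(1, theta, size=(reps, n))
mle = x.mean(axis=1)                 # X_bar for each replication

bias = mle.mean() - theta            # should be near 0 (unbiased)
mse = ((mle - theta) ** 2).mean()    # should be near theta*(1-theta)/n
print(bias, mse, theta * (1 - theta) / n)
```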
6. Let X1, ..., Xn be a random sample from the pdf. Find the method of moments estimator of θ.
QUESTION 2. Let X1, ..., Xn be a random sample from a N(μ, σ^2) distribution, and let S^2 = Σ(Xi − X̄)^2 / (n − 1) and S̃^2 = ((n − 1)/n) S^2 be two estimators of σ^2. Given: E(S^2) = σ^2 and V(S^2) = 2σ^4/(n − 1). (a) Determine: (i) E(S̃^2); (ii) V(S̃^2); and (iii) MSE(S̃^2). (b) Which of S^2 and S̃^2 has the larger mean square error? (c) Suppose that θ̂n is an estimator of θ based on a random sample of size n. Another equivalent definition of ...
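For part (b), the standard closed forms under normality can be compared directly: MSE(S^2) = 2σ^4/(n − 1), while S̃^2 has bias −σ^2/n and MSE(S̃^2) = (2n − 1)σ^4/n^2, which is always smaller. A small sketch (n = 10, σ^2 = 1 are arbitrary demo values):

```python
# Closed-form MSE comparison of the two variance estimators under normality.
def mse_s2(n, sigma2):
    # MSE(S^2) = Var(S^2) = 2*sigma^4/(n-1)  (S^2 is unbiased)
    return 2 * sigma2**2 / (n - 1)

def mse_s2_tilde(n, sigma2):
    # MSE(S~^2) = Var + bias^2 = (2n-1)*sigma^4/n^2
    return (2 * n - 1) * sigma2**2 / n**2

n, sigma2 = 10, 1.0
print(mse_s2(n, sigma2), mse_s2_tilde(n, sigma2))  # S~^2 has the smaller MSE
```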
1.(c) 2.(a),(b)
5. Let X1, ..., Xn be i.i.d. N(θ, 1). (a) Show that X̄ is a complete sufficient statistic. (b) Show that the UMVUE of θ^2 is X̄^2 − 1/n.
6. Let X1, ..., Xn be i.i.d. gamma(α, θ), where α > 1 is known, with density f(x) = x^(α−1) e^(−x/θ) / (Γ(α) θ^α), x > 0, θ > 0. (a) Show that Σ Xi is complete and sufficient for θ. (b) Find E[1/X]. (c) Find the UMVUE of 1/θ. ...
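For problem 6(b), a known fact about the gamma(α, θ) scale parameterization is E[1/X] = 1/((α − 1)θ) for α > 1. A Monte Carlo sketch (not part of the problem set; α = 3, θ = 2 are illustrative assumptions):

```python
import numpy as np

# Monte Carlo check of E[1/X] = 1/((alpha - 1)*theta) for X ~ gamma(alpha, theta)
# with theta a scale parameter. alpha = 3, theta = 2 are demo values only.
rng = np.random.default_rng(2)
alpha, theta = 3.0, 2.0
x = rng.gamma(shape=alpha, scale=theta, size=1_000_000)

mc = (1.0 / x).mean()
exact = 1.0 / ((alpha - 1) * theta)
print(mc, exact)
```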
3. Let X1, ..., Xn be i.i.d. Lognormal(μ, σ^2). (a) Suppose σ = 1; prove that S = X(n)/X(1) is an ancillary statistic. (b) Suppose μ = 0; prove that T = X(n) is a sufficient and complete statistic. (c) Find a minimal sufficient statistic.
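The key observation for (a) is that log S is the range of n i.i.d. N(μ, 1) variables, which is location-invariant, so the law of S does not depend on μ. A simulation sketch (not part of the problem set; n = 5 and the two μ values are arbitrary):

```python
import numpy as np

# Simulation hint for ancillarity: the distribution of log S = range of
# N(mu, 1) draws (i.e., log X_(n) - log X_(1)) does not depend on mu.
rng = np.random.default_rng(3)

def mean_log_S(mu, n=5, reps=200_000):
    z = rng.normal(mu, 1.0, size=(reps, n))        # log X_i ~ N(mu, 1)
    return (z.max(axis=1) - z.min(axis=1)).mean()  # estimate of E[log S]

a, b = mean_log_S(0.0), mean_log_S(7.0)
print(a, b)  # nearly identical: the law of S ignores mu
```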
Let X1, ..., Xn be i.i.d. with pdf f(x) = θν^θ / x^(θ+1) · I(x ≥ ν), where I(·) denotes the indicator function. (a) Find a 2-dimensional sufficient statistic for (θ, ν). (b) Suppose θ is a known constant. Find the MLE for ν. (d) Suppose ν = 1. Find the MLE for θ and determine its asymptotic distribution. Carefully justify your answer and state any theorems that you use. (e) Suppose ν = 1. Find the asymptotic distribution of the MLE estimator of exp[−...
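For part (d), with ν = 1 this is the Pareto density f(x) = θ/x^(θ+1) on x ≥ 1, so log Xi is exponential with rate θ, the MLE is θ̂ = n / Σ log Xi, and standard MLE asymptotics give √n(θ̂ − θ) → N(0, θ^2). A Monte Carlo sketch (θ = 2, n = 400 are illustrative assumptions, not from the problem):

```python
import numpy as np

# Sketch for (d) with nu = 1: draw Pareto samples by inverse CDF
# (X = U^(-1/theta) for U ~ Uniform(0,1)), compute the MLE, and check
# that sqrt(n)*(theta_hat - theta) is approximately N(0, theta^2).
rng = np.random.default_rng(4)
theta, n, reps = 2.0, 400, 50_000
u = rng.random(size=(reps, n))
x = u ** (-1.0 / theta)                # Pareto(nu = 1, theta) draws

theta_hat = n / np.log(x).sum(axis=1)  # MLE for each replication
z = np.sqrt(n) * (theta_hat - theta)   # should be approx N(0, theta^2)
print(z.mean(), z.var())
```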
2. Let X1, ..., Xn be a random sample from the pdf. (a) Find the method of moments estimator of θ. (b) Find the maximum likelihood estimator of θ.
Suppose X1, X2, ..., Xn is an iid N(μ, c^2 μ^2) sample, where c^2 is known. Let μ̂ and μ̃ denote the method of moments and maximum likelihood estimators of μ, respectively. (a) Show that μ̂ = X̄ and μ̃ = ..., where m2 = n^(−1) Σi Xi^2 is the second sample (uncentered) moment. (b) Prove that both estimators μ̂ and μ̃ are consistent. (c) Show that √n(μ̂ − μ) → N(0, σ1^2) and √n(μ̃ − μ) → N(0, σ2^2). Calculate σ1^2 and σ2^2. Which estimator ...
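The MLE expression is lost in the recovered text, so the sketch below covers only the method-of-moments part of (c): since X̄ is exactly N(μ, c^2 μ^2 / n), we have √n(X̄ − μ) ~ N(0, c^2 μ^2). The values μ = 3, c^2 = 0.25 are illustrative assumptions:

```python
import numpy as np

# Sketch for (c), MoM estimator only: sqrt(n)*(X_bar - mu) ~ N(0, c^2 * mu^2)
# exactly under this model. mu = 3, c^2 = 0.25 are demo values.
rng = np.random.default_rng(5)
mu, c2, n, reps = 3.0, 0.25, 100, 100_000
sigma = np.sqrt(c2) * mu              # sd of each X_i is c*mu

x = rng.normal(mu, sigma, size=(reps, n))
z = np.sqrt(n) * (x.mean(axis=1) - mu)
print(z.var(), c2 * mu**2)            # empirical variance vs. c^2*mu^2
```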
Exercise 6. Let (X1, ..., Xn) be an iid sample from the Bernoulli distribution with parameter θ, i.e. ... 1. What is the maximum likelihood estimate θ̂ of θ? 2. Show that the maximum likelihood estimator of θ is unbiased. 3. We wish to estimate the variance θ(1 − θ) of Xi. With X̄ the empirical average, let T = X̄(1 − X̄). Check that T is not an unbiased estimator, and propose an unbiased estimator of θ(1 − θ).
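For part 3, the bias comes from E[X̄^2] = Var(X̄) + θ^2, which gives E[T] = (1 − 1/n) θ(1 − θ); rescaling by n/(n − 1) removes it. A Monte Carlo sketch (θ = 0.4, n = 20 are illustrative assumptions):

```python
import numpy as np

# Monte Carlo check: T = X_bar*(1 - X_bar) has E[T] = (1 - 1/n)*theta*(1-theta),
# so n/(n-1) * T is unbiased for theta*(1-theta). Demo values only.
rng = np.random.default_rng(6)
theta, n, reps = 0.4, 20, 400_000
xbar = rng.binomial(n, theta, size=reps) / n   # X_bar for each replication

t = xbar * (1 - xbar)
t_unbiased = n / (n - 1) * t
target = theta * (1 - theta)                   # true variance of X_i
print(t.mean(), t_unbiased.mean(), target)
```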