Let X1, ..., Xn be i.i.d. with density f(x; θ) = (1/θ) exp(−x/θ) for x > 0 and θ > 0. (a) Find the...
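The statement above is cut off, but for this mean-θ exponential density the standard first task is the MLE of θ, which is the sample mean. A minimal numerical sketch (θ = 2 and the grid bounds are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.0
# numpy's exponential takes scale = mean, matching the (1/theta)exp(-x/theta) form
x = rng.exponential(scale=theta_true, size=100_000)

def log_likelihood(theta, xs):
    # log of prod_i (1/theta) exp(-x_i/theta)
    return -len(xs) * np.log(theta) - xs.sum() / theta

# Solving d/dtheta log L = 0 gives theta_hat = sample mean
theta_mle = x.mean()

# Grid check that the sample mean does maximize the log-likelihood
grid = np.linspace(0.5, 5.0, 401)
best = grid[np.argmax([log_likelihood(t, x) for t in grid])]
```

The grid maximizer agrees with the closed-form sample mean up to grid spacing.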
Parts (f) and (g) only. 44. Let X1, ..., Xn be a random sample from ... (a) Find a sufficient statistic. (b) Find a maximum-likelihood estimator of θ. (c) Find a method-of-moments estimator of θ. (d) Is there a complete sufficient statistic? If so, find it. (e) Find the UMVUE of θ if one exists. (f) Find the Pitman estimator for the location parameter θ. (g) Using the prior density g(θ) = e^(−θ) I_(0,∞)(θ), find the posterior Bayes estimator of θ.
7. Let X1, ..., Xn be i.i.d. with the density p(x; θ) = θ^x (1 − θ)^(1−x) 1{x = 0, 1}. (a) Find the ML estimator of θ. (b) Is it unbiased? (c) Compute its MSE.
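Parts (a)–(c) can be checked exactly: for this Bernoulli model the ML estimator is the sample mean, and enumerating all 2^n samples verifies E[θ̂] = θ (unbiased) and MSE = θ(1 − θ)/n. The values θ = 0.3, n = 4 below are illustrative:

```python
from itertools import product

theta, n = 0.3, 4

# Enumerate all 2^n Bernoulli samples to compute E[theta_hat] and MSE exactly,
# where theta_hat = sample mean (the ML estimator for this model).
mean_hat = 0.0
mse = 0.0
for xs in product([0, 1], repeat=n):
    s = sum(xs)
    p = theta ** s * (1 - theta) ** (n - s)  # P(X1, ..., Xn = xs)
    t_hat = s / n
    mean_hat += p * t_hat
    mse += p * (t_hat - theta) ** 2

# Theory: mean_hat = theta (unbiased), mse = theta * (1 - theta) / n
```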
Recall that the exponential distribution with parameter λ > 0 has density g(x) = λe^(−λx) (x > 0). We write X ~ Exp(λ) when a random variable X has this distribution. The Gamma distribution with positive parameters α (shape) and β (rate) has density h(x) ∝ x^(α−1) e^(−βx) (x > 0), and has expectation α/β. We write X ~ Gamma(α, β) when a random variable X has this distribution. Suppose we have independent and identically distributed random variables X1, ..., Xn that...
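A quick sanity check of the stated Gamma expectation α/β (the values α = 3, β = 2 are arbitrary; note numpy parametrizes Gamma by shape and scale = 1/rate):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 3.0, 2.0
# shape/rate parametrization: scale = 1 / rate
samples = rng.gamma(shape=alpha, scale=1.0 / beta, size=200_000)
sample_mean = samples.mean()  # theory: alpha / beta = 1.5
```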
Suppose X1, X2, ..., Xn (n ≥ 5) are i.i.d. Exp(µ) with density f(x) = (1/µ) e^(−x/µ) for x > 0. (a) Let µ̂1 = X̄. Show µ̂1 is a minimum variance unbiased estimator. (b) Let µ̂2 = (X1 + X2)/2. Show µ̂2 is unbiased. Calculate Var(µ̂2). Confirm Var(µ̂1) < Var(µ̂2). Calculate the efficiency of µ̂2 relative to µ̂1. (c) Show X̄ is consistent and sufficient. (d) Show µ̂2 is not consistent...
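A simulation sketch of part (b), with assumed values µ = 1.5 and n = 10: theory gives Var(µ̂1) = µ²/n, Var(µ̂2) = µ²/2, so the relative efficiency of µ̂2 is 2/n.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, n, reps = 1.5, 10, 200_000
x = rng.exponential(scale=mu, size=(reps, n))

mu1 = x.mean(axis=1)            # sample mean of all n observations
mu2 = (x[:, 0] + x[:, 1]) / 2   # average of the first two only

var1 = mu1.var()                # theory: mu^2 / n = 0.225
var2 = mu2.var()                # theory: mu^2 / 2 = 1.125
efficiency = var1 / var2        # theory: 2 / n = 0.2
```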
Let the random sample X1, ..., Xn be taken from the Binomial distribution with parameter θ, which is unknown and must be estimated. Let the prior distribution of θ be the Beta distribution with known parameters α > 0 and β > 0. Find the Bayes risk and the Bayes estimator of θ under squared error loss.
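For reference, assuming each Xi is a single Bernoulli(θ) trial, conjugacy gives posterior Beta(α + Σxi, β + n − Σxi), so the Bayes estimator under squared error loss is the posterior mean (α + Σxi)/(α + β + n). A minimal sketch with hypothetical data and hyperparameters:

```python
alpha, beta_ = 2.0, 3.0
xs = [1, 0, 1, 1, 0]  # hypothetical Bernoulli(theta) observations
n, s = len(xs), sum(xs)

# Beta prior is conjugate: posterior is Beta(alpha + s, beta_ + n - s),
# and the Bayes estimator under squared error loss is the posterior mean.
bayes_est = (alpha + s) / (alpha + beta_ + n)
```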
2. Let X1, X2, ..., Xn be i.i.d. Poisson with parameter λ. (a) Find the maximum likelihood estimator of λ. Is the estimator minimum variance unbiased? (b) Derive the asymptotic (large-sample) distribution of the maximum likelihood estimator. (c) Suppose we are interested in the probability of a zero: Q = P(Xi = 0) = exp(−λ). Find the maximum likelihood estimator of Q and its asymptotic distribution.
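Part (c) can be checked numerically: by invariance Q̂ = e^(−λ̂) = e^(−X̄), and the delta method gives asymptotic variance λ e^(−2λ)/n. A simulation sketch with assumed values λ = 2 and n = 50:

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n, reps = 2.0, 50, 100_000
x = rng.poisson(lam, size=(reps, n))

lam_hat = x.mean(axis=1)     # MLE of lambda is the sample mean
q_hat = np.exp(-lam_hat)     # MLE of Q = P(X = 0) by invariance

# Delta method: Var(q_hat) ~= lambda * exp(-2*lambda) / n for large n
delta_var = lam * np.exp(-2 * lam) / n
sim_var = q_hat.var()
```

The simulated variance matches the delta-method approximation to within a few percent at this sample size.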
Suppose X1, X2, ..., Xn are i.i.d. Exp(µ) with density f(x) = (1/µ) e^(−x/µ) for x > 0. (a) Use the method of moments to find estimators for µ and µ^2. (b) What is the log likelihood as a function of µ after observing X1 = x1, ..., Xn = xn? (c) Find the MLEs for µ and µ^2. Are they the same as those you found in part (a)? (d) According to the Central Limit...
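A sketch contrasting the estimators of µ^2 in parts (a) and (c): matching the second moment uses E[X^2] = 2µ^2, while the MLE of µ^2 is X̄^2 by invariance, so the two generally differ. The value µ = 2 is illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
mu = 2.0
x = rng.exponential(scale=mu, size=500_000)

# Method of moments: match E[X] = mu and E[X^2] = 2*mu^2
mom_mu = x.mean()
mom_mu2 = (x ** 2).mean() / 2.0

# MLE: mu_hat = sample mean; by invariance the MLE of mu^2 is mu_hat^2
mle_mu = x.mean()
mle_mu2 = mle_mu ** 2
```

Both estimators of µ^2 are consistent, but they are distinct statistics in any finite sample.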
Let X1, ..., Xn be a random sample from a population with pdf f(x; θ) = ... for x ≥ 0 and 0 for x < 0, where θ > 0 is unknown. (a) Show that the Gamma(a, b) prior, with pdf ... for θ ≥ 0 and 0 for θ < 0, is a conjugate prior for θ (a > 0 and b > 0 are known constants). (b) Find the Bayes estimator of θ under squared error loss. (c) Find the Bayes estimator of (2π − 10)^(1/2) under squared error...
Let X1, ..., Xn be independent Beta(θ, 1) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
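A sketch of part (1), taking the Bayes estimator to be the posterior mean (squared error loss): the Beta(θ, 1) density is θ x^(θ−1) on (0, 1), so the Gamma(α, β) prior is conjugate and the posterior is Gamma(α + n, β − Σ log Xi), giving posterior mean (α + n)/(β − Σ log Xi). The data and hyperparameters below are hypothetical, and a quadrature check confirms the closed form:

```python
import numpy as np

alpha, beta = 2.0, 1.0
xs = np.array([0.5, 0.8, 0.3, 0.9])  # hypothetical observations in (0, 1)
n = len(xs)
s = np.log(xs).sum()  # sum of log X_i (negative since X_i lies in (0, 1))

# Likelihood theta^n * (prod x_i)^(theta - 1) times the Gamma(alpha, beta)
# prior gives posterior Gamma(alpha + n, beta - s); Bayes est = posterior mean.
bayes_est = (alpha + n) / (beta - s)

# Sanity check: mean of the unnormalized posterior by Riemann-sum quadrature
theta = np.linspace(1e-6, 30.0, 200_000)
post = theta ** (alpha + n - 1) * np.exp(-theta * (beta - s))
quad_mean = (theta * post).sum() / post.sum()
```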