Suppose X1, X2, . . . , Xn are iid with pdf f(x|θ) = θx^(θ−1) I(0 ≤ x ≤ 1), θ > 0.
(a) Is − log(X1) unbiased for θ^(−1)?
(b) Find a better estimator than −log(X1), in the sense of smaller MSE.
(c) Is your estimator in part (b) UMVUE? Explain.
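A quick Monte Carlo sanity check (not a substitute for the formal argument in parts (a)–(b)): if U ~ Uniform(0, 1), then X = U^(1/θ) has the pdf θx^(θ−1) on (0, 1), so we can simulate both −log(X1) and the sample mean of the −log(Xi) and compare their MSEs. The parameter values below are illustrative choices, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000
# If U ~ Uniform(0,1), then X = U**(1/theta) has pdf theta * x**(theta-1) on (0,1)
x = rng.random((reps, n)) ** (1 / theta)
t1 = -np.log(x[:, 0])           # -log(X1): single-observation estimator of 1/theta
t2 = -np.log(x).mean(axis=1)    # sample mean of -log(Xi)
target = 1 / theta
print("means:", t1.mean(), t2.mean())   # both near 1/theta = 0.5
mse1 = np.mean((t1 - target) ** 2)
mse2 = np.mean((t2 - target) ** 2)
print("MSEs:", mse1, mse2)              # mse2 is roughly mse1 / n
```

The simulation is consistent with −log(X1) being unbiased for 1/θ, and with averaging reducing the MSE by a factor of about n.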
Suppose that X1, X2, ..., Xn is an iid sample from f_X(x|θ), where θ ∈ Θ. In each case below, find (i) the method of moments estimator of θ, (ii) the maximum likelihood estimator of θ, and (iii) the uniformly minimum variance unbiased estimator (UMVUE) of τ(θ) = θ: f_X(x|θ) = (1/(2θ)) 1(0 < x < 2θ), Θ = {θ : θ > 0}, with τ(θ) arbitrary and differentiable. (d) (sample size of n = 1 only) τ(θ) = e^(−2θ). In part (d), comment on whether the UMVUE...
Suppose that X1, ..., Xn are iid with pdf f(x; θ) = 1/(2θ) if −θ < x < θ, and 0 otherwise. (a) Find an unbiased estimator of θ. You must prove that your estimator is unbiased. (b) Find the variance of the estimator in (a).
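As a numerical sanity check (illustrative parameter values, not part of the problem): since E|X| = θ/2 for this Uniform(−θ, θ) density, one natural candidate is twice the sample mean of the |Xi|, which the simulation below compares against the theoretical variance θ²/(3n).

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 25, 200_000
x = rng.uniform(-theta, theta, size=(reps, n))
# Since E|X| = theta/2, a natural unbiased candidate is 2 * mean(|Xi|)
est = 2 * np.abs(x).mean(axis=1)
print(est.mean())                       # near theta = 2
print(est.var(), theta**2 / (3 * n))    # simulated vs. theoretical variance
```

Here Var(|X|) = θ²/12, so Var(est) = 4(θ²/12)/n = θ²/(3n), matching the simulation.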
Let X1, X2, ..., Xn be iid with pdf f(x|θ) = θx^(θ−1). (a) Find the maximum likelihood estimator of θ, and (b) show that its variance converges to 0 as n approaches infinity. I have no problem with part (a), finding the MLE of θ. However, I'm having some trouble with finding the variance. The professor walked us through part (b) generally, but I need help with the univariate transformation for Σ(−ln Xi) — the professor used Y = Σ(−ln Xi), and...
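For the variance question: each −ln Xi is Exponential with rate θ, so Y = Σ(−ln Xi) is Gamma(n, rate θ) and the MLE is n/Y. The sketch below (illustrative θ, not the transformation derivation itself) shows the MLE's variance shrinking roughly like θ²/n as n grows.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, reps = 2.0, 200_000
variances = {}
for n in (5, 20, 80):
    x = rng.random((reps, n)) ** (1 / theta)   # pdf theta * x**(theta-1) on (0,1)
    y = -np.log(x).sum(axis=1)                 # Y ~ Gamma(n, rate theta)
    mle = n / y
    variances[n] = mle.var()
print(variances)   # variance decreases in n, roughly like theta**2 / n
```

The exact variance, n²θ²/((n−1)²(n−2)) for n > 2, indeed tends to 0 as n → ∞.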
Suppose X1, X2, ..., Xn is an iid sample from f_X(x|θ) = θe^(−θx) 1(x > 0), where θ > 0. (a) For n ≥ 2, show that θ̂ = (n − 1)/Σ Xi is the uniformly minimum variance unbiased estimator (UMVUE) of θ. (b) Calculate Var_θ(θ̂). Comment, in particular, on the n = 2 case. (c) Show that Var_θ(θ̂) does not attain the Cramér-Rao Lower Bound (CRLB) on the variance of all unbiased estimators of τ(θ) = θ. (d) For this part only, suppose that n = 1. If T(X1) is an unbiased estimator of θ, show that P_θ(T(X1) < 0) > 0.
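Assuming the estimator in part (a) is θ̂ = (n − 1)/Σ Xi (the standard form for this exponential model), a quick simulation with illustrative parameter values checks unbiasedness and shows the variance sitting strictly above the CRLB θ²/n:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 3.0, 5, 400_000
x = rng.exponential(scale=1/theta, size=(reps, n))   # rate-theta exponential draws
est = (n - 1) / x.sum(axis=1)                        # candidate UMVUE of theta
print(est.mean())                  # near theta = 3
print(est.var(), theta**2 / n)     # variance exceeds the CRLB theta**2 / n
```

The exact variance is θ²/(n − 2) for n > 2 (here 3.0 versus the CRLB of 1.8), consistent with part (c).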
Suppose that X1, X2, ..., Xn is an iid sample from f_X(x|σ) = (1/(2σ)) e^(−|x|/σ), x ∈ ℝ, where σ > 0. (a) Show that T = Σ |Xi| is a complete and sufficient statistic for σ. (b) Prove that Y1 = |X1| follows an exponential distribution with mean σ. (c) Find the uniformly minimum variance unbiased estimator (UMVUE) of τ(σ) = σ^r, where r is a fixed constant larger than 0.
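Assuming the density is the double-exponential (Laplace) form (1/(2σ)) e^(−|x|/σ), the claim in part (b) can be spot-checked numerically: |X| should behave like an Exponential variable with mean σ and variance σ². Parameter values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, n = 2.0, 400_000
# Laplace pdf: (1/(2*sigma)) * exp(-|x|/sigma)
x = rng.laplace(loc=0.0, scale=sigma, size=n)
y = np.abs(x)
print(y.mean(), y.var())   # exponential with mean sigma: mean near 2, variance near 4
```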
Suppose X1, X2, ..., Xn is an iid sample from f_X(x|θ) = θ(1 − x)^(θ−1) 1(0 < x < 1), where θ > 0. (a) Find the method of moments (MOM) estimator of θ. (b) Find the maximum likelihood estimator (MLE) of θ. (c) Find the MLE of P_θ(X1 > 1/2). (d) Is there a function of θ, say τ(θ), for which there exists an unbiased estimator whose variance attains the Cramér-Rao Lower Bound? If so, find it and identify the corresponding estimator. If not, show why not.
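Assuming the density is θ(1 − x)^(θ−1) on (0, 1): since E[X] = 1/(1 + θ), the MOM estimator is 1/X̄ − 1, and the log-likelihood first-order condition gives the MLE −n/Σ log(1 − Xi). A simulation sketch with an illustrative θ:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n = 3.0, 50_000
# If U ~ Uniform(0,1), then X = 1 - U**(1/theta) has pdf theta*(1-x)**(theta-1) on (0,1)
x = 1 - rng.random(n) ** (1 / theta)
mom = 1 / x.mean() - 1                 # from E[X] = 1/(1 + theta)
mle = -n / np.log(1 - x).sum()         # from the log-likelihood first-order condition
print(mom, mle)                        # both near theta = 3
```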
Suppose that X1, X2, ..., Xn are iid Poisson observations, each having common pmf f(x|θ) = θ^x e^(−θ)/x! for x = 0, 1, 2, ..., and 0 otherwise. Find the UMVUE of τ(θ) = θ^2.
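Assuming the target is τ(θ) = θ², a candidate answer is T(T − 1)/n² with T = Σ Xi, since T ~ Poisson(nθ) and E[T(T − 1)] = (nθ)². The simulation below (illustrative parameters) checks unbiasedness:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 300_000
x = rng.poisson(theta, size=(reps, n))
t = x.sum(axis=1)             # complete sufficient statistic, T ~ Poisson(n*theta)
est = t * (t - 1) / n**2      # candidate UMVUE of theta**2
print(est.mean())             # near theta**2 = 4
```

Since the candidate is a function of the complete sufficient statistic and unbiased, Lehmann-Scheffé makes it the UMVUE.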
Suppose X1, X2, ..., Xn is an iid sample from a uniform distribution over (θ, θ + 1), where θ ∈ ℝ. (a) Find the method of moments estimator of θ. (b) Find the maximum likelihood estimator (MLE) of θ. (c) Is the MLE of θ a consistent estimator of θ? Explain.
Suppose X1, X2, ..., Xn are a random sample from a Uniform(0, θ) distribution, where θ > 0. Consider two different estimators of θ: R1 = 2X̄ and R2 = ((n + 1)/n) max(X1, ..., Xn). (a) For each of the estimators R1 and R2, assess whether it is an unbiased estimator of θ. (b) Compute the variances of R1 and R2. Under what conditions will R2 have a smaller variance than R1?
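A Monte Carlo sketch (illustrative parameter values) for comparing the two estimators: both should center on θ, with Var(R1) = θ²/(3n) and Var(R2) = θ²/(n(n + 2)), so R2 wins for n ≥ 2.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 5.0, 10, 300_000
x = rng.uniform(0, theta, size=(reps, n))
r1 = 2 * x.mean(axis=1)                  # moment-based estimator
r2 = (n + 1) / n * x.max(axis=1)         # bias-corrected maximum
print(r1.mean(), r2.mean())              # both near theta = 5
print(r1.var(), theta**2 / (3 * n))      # Var(R1) = theta**2 / (3n)
print(r2.var(), theta**2 / (n * (n + 2)))  # Var(R2) = theta**2 / (n(n+2))
```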