Suppose that X1, X2, ..., Xn are iid Poisson observations, each having common pmf f(x; θ) = θ^x e^(−θ)/x!, x = 0, 1, 2, ... …
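The rest of this problem is truncated in the source; as a minimal sketch, the reconstructed pmf θ^x e^(−θ)/x! can be checked against scipy's reference implementation (θ = 2.5 is an arbitrary test value):

```python
# Check the reconstructed Poisson pmf f(x; theta) = theta^x e^(-theta) / x!
# against scipy's reference implementation.
import math
from scipy.stats import poisson

theta = 2.5
for x in range(10):
    hand = theta**x * math.exp(-theta) / math.factorial(x)
    assert abs(hand - poisson.pmf(x, theta)) < 1e-12
print("pmf values agree for x = 0..9")
```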
Suppose X1, X2, ..., Xn are iid with pdf f(x|θ) = θx^(θ−1) I(0 ≤ x ≤ 1), θ > 0. (a) Is −log(X1) unbiased for θ^(−1)? (b) Find an estimator better than −log(X1) in the sense of smaller MSE. (c) Is your estimator in part (b) the UMVUE? Explain.
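A Monte Carlo sketch of parts (a) and (b): for this Beta(θ, 1) model, −log(X1) has an Exponential(θ) distribution, so it is unbiased for 1/θ, and averaging all n logs keeps the mean while cutting the variance by a factor of n. The values of θ, n, and the replication count below are arbitrary choices for the check.

```python
# Monte Carlo check: -log(X1) is unbiased for 1/theta under f(x|theta) = theta x^(theta-1),
# and the pooled estimator -(1/n) * sum(log Xi) has the same mean but ~n-times-smaller MSE.
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000
X = rng.beta(theta, 1.0, size=(reps, n))

single = -np.log(X[:, 0])            # -log(X1)
pooled = -np.log(X).mean(axis=1)     # -(1/n) * sum(log Xi)

target = 1.0 / theta
print("E[-log X1]        ~", single.mean())                   # ~ 0.5
print("MSE of -log X1    ~", ((single - target)**2).mean())
print("MSE of pooled est ~", ((pooled - target)**2).mean())   # ~ n times smaller
```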
6.4.3. Let X1, X2, ..., Xn be iid, each with the distribution having pdf f(x; θ1, θ2) = (1/θ2) e^(−(x−θ1)/θ2), θ1 < x < ∞, −∞ < θ1 < ∞, 0 < θ2 < ∞, zero elsewhere. Find the maximum likelihood estimators of θ1 and θ2.
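A quick simulation sketch for this shifted-exponential family. The closed-form MLEs used below, θ̂1 = X(1) and θ̂2 = mean of (Xi − X(1)), are the standard answer for this model and are taken here as an assumption to sanity-check, not derived.

```python
# Simulate the shifted exponential f(x; th1, th2) = (1/th2) exp(-(x - th1)/th2), x > th1,
# and evaluate the standard MLEs: th1_hat = sample minimum, th2_hat = mean of (Xi - min).
import numpy as np

rng = np.random.default_rng(1)
th1, th2, n = 3.0, 2.0, 5_000
x = th1 + rng.exponential(scale=th2, size=n)

th1_hat = x.min()
th2_hat = (x - th1_hat).mean()
print(th1_hat, th2_hat)   # close to (3.0, 2.0) for large n
```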
Suppose that X1, X2, ..., Xn is an iid sample from f_X(x|θ), where θ ∈ Θ. In each case below, find (i) the method of moments estimator of θ, (ii) the maximum likelihood estimator of θ, and (iii) the uniformly minimum variance unbiased estimator (UMVUE) of τ(θ) = θ. … f_X(x|θ) = (1/(2θ)) I(0 < x < 2θ), Θ = {θ : θ > 0}, τ(θ) arbitrary and differentiable … (d) … (sample size of n − 1 only) … In part (d), comment on whether the UMVUE...
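The density in the first legible case appears, after cleanup, to be Uniform(0, 2θ); under that reading (an assumption about the garbled original), E[X] = θ so the method of moments estimator is the sample mean, while the likelihood (1/(2θ))^n on {X(n) ≤ 2θ} is maximized at X(n)/2. A comparison sketch:

```python
# Sketch for the (assumed) Uniform(0, 2*theta) case: method of moments vs. MLE.
# MoM: theta_hat = X-bar, since E[X] = theta. MLE: theta_hat = X_(n)/2.
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 1.5, 20, 100_000
X = rng.uniform(0.0, 2*theta, size=(reps, n))

mom = X.mean(axis=1)          # method of moments: sample mean
mle = X.max(axis=1) / 2.0     # MLE: half the sample maximum
print("MoM mean/MSE:", mom.mean(), ((mom - theta)**2).mean())
print("MLE mean/MSE:", mle.mean(), ((mle - theta)**2).mean())
```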
Suppose X1, X2, ..., Xn are independent and identically distributed (iid) with a Uniform(−θ, θ) distribution for some unknown θ > 0, i.e., the Xi's have pdf f(x) = 1/(2θ) if −θ < x < θ, and 0 otherwise. (a) (4 pts) Briefly explain why or why not this is an exponential family. (b) (5 pts) Find one meaningful sufficient statistic for θ. (By "meaningful", I mean it...
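A small sketch illustrating part (b): the likelihood (1/(2θ))^n · 1{max|Xi| ≤ θ} depends on the data only through max(|X1|, ..., |Xn|), so two different samples sharing that value trace out identical likelihood functions in θ.

```python
# Two samples with the same max|Xi| give identical likelihoods under Uniform(-theta, theta),
# illustrating why max|Xi| is the natural sufficient statistic.
import numpy as np

def likelihood(x, theta):
    return (1.0 / (2*theta))**len(x) if np.max(np.abs(x)) <= theta else 0.0

a = np.array([-0.9, 0.2, 0.5])
b = np.array([0.9, -0.1, -0.3])   # different sample, same max|Xi| = 0.9
grid = np.linspace(0.5, 2.0, 7)
print([likelihood(a, t) for t in grid] == [likelihood(b, t) for t in grid])  # True
```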
Solve the following two parts (hint: use a complete sufficient statistic).
a. Suppose X1, X2, ..., Xn are iid Poisson with mean λ. Find the UMVUE of λ² and the UMVUE of (λ−1)(λ−1).
b. Suppose X1, X2, ..., Xn are iid Uniform[0, θ]. Assume n ≥ 3. Find the UMVUE of θ³ and the UMVUE of 1/θ.
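A Monte Carlo sketch for part a: with T = ΣXi ~ Poisson(nλ) complete and sufficient, E[T(T−1)] = (nλ)², so T(T−1)/n² is unbiased for λ². This is the standard route to the UMVUE of λ² via the complete sufficient statistic, stated here as the assumed target of the check.

```python
# Check that T(T-1)/n^2 is unbiased for lambda^2, where T = sum(Xi) ~ Poisson(n*lambda).
import numpy as np

rng = np.random.default_rng(3)
lam, n, reps = 1.7, 8, 500_000
T = rng.poisson(lam * n, size=reps)        # distribution of sum(Xi)
est = T * (T - 1) / n**2
print(est.mean(), lam**2)                  # both ~ 2.89
```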
Suppose that X1, X2, ..., Xn ~ iid Poisson(λ). Define two estimators for λ. (a) Show … (b) Show the variances of the estimators. Provide the relative efficiency (the ratio of the two MSEs) of … and draw your conclusion.
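The problem's two estimators were lost in extraction. Purely as a hypothetical stand-in, the sketch below compares the two classic unbiased estimators of a Poisson mean, X̄ and the sample variance S² (unbiased because mean = variance for a Poisson), via the ratio of their MSEs.

```python
# Hypothetical illustration: relative efficiency of S^2 vs. X-bar as estimators of
# a Poisson mean lambda (both unbiased; X-bar has the smaller variance).
import numpy as np

rng = np.random.default_rng(4)
lam, n, reps = 3.0, 25, 100_000
X = rng.poisson(lam, size=(reps, n))

xbar = X.mean(axis=1)
s2 = X.var(axis=1, ddof=1)
mse = lambda e: ((e - lam)**2).mean()
print("relative efficiency MSE(S^2)/MSE(X-bar):", mse(s2) / mse(xbar))  # > 1: X-bar wins
```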
Let X1, ..., Xn be iid from Uniform(−θ, θ), where θ > 0. Let X(1) < X(2) < ... < X(n) denote the order statistics. (a) Find a minimal sufficient statistic for θ. (d) Find the UMVUE for θ. (e) Find the UMVUE for τ(θ) = P(X1 > k).
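A Monte Carlo sketch for part (d): with M = max|Xi| and each |Xi| ~ Uniform(0, θ), E[M] = nθ/(n + 1), so ((n + 1)/n)·M is unbiased. This is the standard UMVUE candidate built from M, taken here as the assumed target of the check.

```python
# Check that ((n+1)/n) * max|Xi| is unbiased for theta under Uniform(-theta, theta).
import numpy as np

rng = np.random.default_rng(5)
theta, n, reps = 2.0, 6, 300_000
X = rng.uniform(-theta, theta, size=(reps, n))
M = np.abs(X).max(axis=1)
print(((n + 1) / n * M).mean(), theta)   # both ~ 2.0
```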
Suppose that X1, X2, ..., Xn is an iid sample, each with probability p of being distributed as uniform over (−1/2, 1/2) and with probability 1 − p of being distributed as uniform over … (a) Find the cumulative distribution function (cdf) and the probability density function (pdf) of X1. (b) Find the maximum likelihood estimator (MLE) of p. (c) Find another estimator of p using the method of moments (MOM).
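The second uniform interval is elided in the statement above. Purely as a hypothetical stand-in, take it to be (−1, 1); matching second moments then gives E[X²] = p/12 + (1 − p)/3, so a MOM estimator is p̂ = 4(1/3 − m2), with m2 the sample second moment.

```python
# Hypothetical MOM sketch for the two-component uniform mixture, assuming the elided
# second interval is (-1, 1): solve E[X^2] = p/12 + (1-p)/3 for p.
import numpy as np

rng = np.random.default_rng(6)
p, n = 0.6, 200_000
narrow = rng.random(n) < p
X = np.where(narrow, rng.uniform(-0.5, 0.5, n), rng.uniform(-1.0, 1.0, n))

m2 = (X**2).mean()
print(4 * (1/3 - m2))   # ~ 0.6
```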
Suppose that X1, X2, ..., Xn is an iid sample from the probability density function (pdf) given by …, where β > 0 is unknown and m is a known constant larger than 1. (a) Show that T = T(X) = Σ_{i=1}^n Xi is a complete and sufficient statistic for {f_X(x|β) : β > 0}. (b) Show that … (c) For t > 0, show that the conditional density of X1, given T = t, is Γ(mn) … ^(m(n−1)−1) … (d) Show that …
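The density itself was lost in extraction; the Γ(mn) factor surviving in part (c) suggests a Gamma family with known shape m and unknown scale β, which is assumed here. Under that reading, T = ΣXi ~ Gamma(mn, β), so T/(mn) is unbiased for β, a quick check of the role T plays in the problem:

```python
# Assumed Gamma(shape m, scale beta) model, m known: T = sum(Xi) ~ Gamma(mn, beta),
# so T/(mn) is unbiased for beta.
import numpy as np

rng = np.random.default_rng(7)
m, beta, n, reps = 2.0, 1.5, 10, 200_000
T = rng.gamma(shape=m, scale=beta, size=(reps, n)).sum(axis=1)
print((T / (m * n)).mean(), beta)   # both ~ 1.5
```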
Again, let X1, ..., Xn be iid observations from the Uniform(0, θ) distribution. (a) Find the joint pdf of X(1) and X(n). (b) Define R = X(n) − X(1) as the sample range. Find the pdf of R. (c) It turns out that if X1, ..., Xn are iid Uniform(0, θ), then E(R) = ((n − 1)/(n + 1)) θ. What happens to E(R) as n increases? Briefly explain in words why this makes sense intuitively.
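A Monte Carlo sketch for part (c): E(R) = ((n − 1)/(n + 1))θ climbs toward θ as n grows, since a larger sample almost surely contains points near both endpoints.

```python
# Check E(R) = ((n-1)/(n+1)) * theta for the Uniform(0, theta) sample range, across n.
import numpy as np

rng = np.random.default_rng(8)
theta, reps = 1.0, 200_000
for n in (2, 5, 20, 100):
    X = rng.uniform(0, theta, size=(reps, n))
    R = X.max(axis=1) - X.min(axis=1)
    print(n, R.mean(), (n - 1) / (n + 1) * theta)   # simulated vs. exact E(R)
```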