3. Let X1, X2, . . . , Xn be independent samples of a random variable with the probability density function (PDF)

fX(x) = θ(x − 1/2) + 1 for 0 ≤ x ≤ 1, and fX(x) = 0 otherwise,

where θ ∈ [−2, 2] is an unknown parameter. We define the estimator θ̂n = 12X̄ − 6, where X̄ is the sample mean, to estimate θ.
(a) Is θ̂n an unbiased estimator of θ?
(b) Is θ̂n a consistent estimator of θ?
(c) Find the mean squared error (MSE) of θ̂n.
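A quick Monte Carlo sketch (not part of the problem) that probes the unbiasedness claim in (a) numerically, assuming fX(x) = θ(x − 1/2) + 1 on [0, 1] and the estimator 12X̄ − 6; rejection sampling is used because the density is bounded by 1 + |θ|/2:

```python
import random

def sample_x(theta, rng):
    # Rejection sampling from f(x) = theta*(x - 1/2) + 1 on [0, 1].
    m = 1.0 + abs(theta) / 2.0  # upper bound on the density
    while True:
        x = rng.random()
        if rng.random() * m <= theta * (x - 0.5) + 1.0:
            return x

def theta_hat(theta, n, rng):
    # The estimator 12*Xbar - 6 from one sample of size n.
    xbar = sum(sample_x(theta, rng) for _ in range(n)) / n
    return 12.0 * xbar - 6.0

rng = random.Random(0)
theta, n, reps = 1.5, 50, 4000
estimates = [theta_hat(theta, n, rng) for _ in range(reps)]
mean_est = sum(estimates) / reps
print(round(mean_est, 2))  # close to theta = 1.5, consistent with unbiasedness
```

Averaging many replications of the estimator recovers θ, which matches the analytic fact E[X] = θ/12 + 1/2, hence E[12X̄ − 6] = θ.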
Let X be a random variable with probability density function (pdf) fX(x | θ), zero elsewhere, where θ > 0 is an unknown parameter.
(a) Find the cumulative distribution function (cdf) of the random variable Y and identify the distribution.
Let X1, X2, . . . , Xn be a random sample of size n > 2 from fX(x | θ).
(b) Find the maximum likelihood estimator, θ̂MLE, of θ.
(c) Find the uniform minimum variance unbiased estimator (UMVUE), θ̂UMVUE, of θ.
Only one answer is correct. Let X1, X2, . . . , Xn be independent samples from a distribution with pdf fX(x) = e^(−(x − θ)), x ≥ θ. Which of the following is an unbiased estimator for θ?
Let X1, X2, . . . , Xn be a random sample of size n from a population that can be modeled by the following probability model:

fX(x) = a x^(a−1) / θ^a, 0 < x < θ, a > 0.

a) Find the probability density function of X(n) = max(X1, X2, . . . , Xn).
b) Is X(n) an unbiased estimator of θ? If not, suggest a function of X(n) that is an unbiased estimator of θ.
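One way to probe part (b) numerically, assuming the pdf a x^(a−1)/θ^a on (0, θ): the cdf is F(x) = (x/θ)^a, so inverse-CDF sampling gives X = θ U^(1/a), and the standard result E[X(n)] = θ · na/(na + 1) suggests the correction factor (na + 1)/(na). A minimal sketch:

```python
import random

def sample_max(theta, a, n, rng):
    # Inverse-CDF sampling: F(x) = (x/theta)**a on (0, theta), so X = theta * U**(1/a).
    return max(theta * rng.random() ** (1.0 / a) for _ in range(n))

rng = random.Random(0)
theta, a, n, reps = 2.0, 3.0, 10, 20000
maxima = [sample_max(theta, a, n, rng) for _ in range(reps)]
mean_max = sum(maxima) / reps

# E[X(n)] = theta * n*a / (n*a + 1) < theta, so X(n) is biased low;
# c * X(n) with c = (n*a + 1) / (n*a) is unbiased.
print(round(mean_max, 3))                          # near theta * 30/31 ~ 1.935
print(round((n * a + 1) / (n * a) * mean_max, 3))  # near theta = 2
```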
Let X1, . . . , Xn and Y1, . . . , Ym be two independent samples from a Poisson distribution with parameter λ. Let a, b be two positive numbers. Consider the following estimator for λ:

λ̂ = a (X1 + · · · + Xn)/n + b (Y1 + · · · + Ym)/m.

(a) What condition is needed on a and b so that λ̂ is unbiased?
(b) What is the MSE of λ̂?
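A Monte Carlo sketch of this Poisson problem, assuming the estimator λ̂ = a(X1 + · · · + Xn)/n + b(Y1 + · · · + Ym)/m: with a + b = 1 it should come out unbiased, with MSE equal to its variance a²λ/n + b²λ/m.

```python
import math, random

def poisson(lam, rng):
    # Knuth's method; fine for small lam.
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def lam_hat(lam, n, m, a, b, rng):
    xbar = sum(poisson(lam, rng) for _ in range(n)) / n
    ybar = sum(poisson(lam, rng) for _ in range(m)) / m
    return a * xbar + b * ybar

rng = random.Random(1)
lam, n, m, a, b, reps = 2.0, 20, 30, 0.4, 0.6, 5000
est = [lam_hat(lam, n, m, a, b, rng) for _ in range(reps)]
mean_est = sum(est) / reps
mse = sum((e - lam) ** 2 for e in est) / reps

print(round(mean_est, 2))  # near lam = 2, since a + b = 1
# Theory: MSE = a^2*lam/n + b^2*lam/m = 0.16*2/20 + 0.36*2/30 = 0.04
print(round(mse, 3))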
Let X1, . . . , Xn be independent Poisson(θ) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
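A numerical check for this Bayes problem, assuming the Gamma(α, β) prior uses the rate parametrization: the posterior is then Gamma(α + ΣXi, β + n), the Bayes estimator under squared-error loss is the posterior mean (α + ΣXi)/(β + n), and its frequentist MSE at a fixed θ is (nθ + (α − βθ)²)/(β + n)².

```python
import math, random

def poisson(lam, rng):
    # Knuth's method; fine for small lam.
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(2)
theta, n, alpha, beta, reps = 1.0, 25, 2.0, 3.0, 5000

def bayes_est():
    # Posterior mean of Gamma(alpha + S, beta + n), S = sum of the sample.
    s = sum(poisson(theta, rng) for _ in range(n))
    return (alpha + s) / (beta + n)

est = [bayes_est() for _ in range(reps)]
mse = sum((e - theta) ** 2 for e in est) / reps

# Theory: MSE(theta) = (n*theta + (alpha - beta*theta)^2) / (beta + n)^2
theory = (n * theta + (alpha - beta * theta) ** 2) / (beta + n) ** 2
print(round(mse, 4), round(theory, 4))  # both near 0.033
```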
If X1, X2, . . . , Xn constitute a random sample from the population with pdf f(x) = e^(−(x − θ)) for x > θ, and 0 elsewhere:
a) Find E(X̄) and hence show that X̄ is a biased estimator of θ. What is the bias?
b) What estimator based on X̄ would be an unbiased estimator of θ? Why?
c) Given g(y1) = n e^(−n(y1 − θ)) for y1 > θ, and 0 otherwise, show that Y1 = min(X1, X2, . . . , Xn) is a consistent estimator of the parameter θ.
d) Obtain the mean of Y1.
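A simulation sketch of the consistency claim, assuming (consistent with the stated density of Y1) that the parent pdf is f(x) = e^(−(x − θ)) for x > θ, i.e. X = θ + Exp(1), so Y1 = θ + Exp(n):

```python
import math, random

def y1(theta, n, rng):
    # X_i = theta + Exp(1), so Y1 = min X_i = theta + Exp(n).
    return theta + min(-math.log(1.0 - rng.random()) for _ in range(n))

rng = random.Random(3)
theta, reps = 2.0, 4000
for n in (5, 50, 500):
    vals = [y1(theta, n, rng) for _ in range(reps)]
    mean_y1 = sum(vals) / reps
    prob_far = sum(abs(v - theta) > 0.1 for v in vals) / reps
    print(n, round(mean_y1, 3), round(prob_far, 3))
# E[Y1] = theta + 1/n -> theta and P(|Y1 - theta| > eps) = e^(-n*eps) -> 0,
# so Y1 is consistent (though biased upward by 1/n for finite n).
```

As n grows the sample mean of Y1 approaches θ and the probability of landing far from θ collapses, which is exactly the consistency statement in part (c).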
Suppose X1, X2, . . . , Xn are iid with pdf f(x | θ) = θ x^(θ−1) I(0 ≤ x ≤ 1), θ > 0.
(a) Is −log(X1) unbiased for θ^(−1)?
(b) Find a better estimator than −log(X1), in the sense of smaller MSE.
(c) Is your estimator in part (b) UMVUE? Explain.
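A small numerical sketch for parts (a) and (b): since F(x) = x^θ on (0, 1), inverse-CDF sampling gives X = U^(1/θ), and −log X ~ Exp(θ) with mean 1/θ and variance 1/θ². Averaging −log Xi over all n observations keeps the mean 1/θ but shrinks the variance, hence the MSE, by a factor of n.

```python
import math, random

rng = random.Random(4)
theta, n, reps = 2.0, 20, 5000
target = 1.0 / theta

def sample_x():
    # Inverse CDF: F(x) = x**theta on (0, 1), so X = U**(1/theta).
    return rng.random() ** (1.0 / theta)

mse_single, mse_avg = 0.0, 0.0
for _ in range(reps):
    xs = [sample_x() for _ in range(n)]
    single = -math.log(xs[0])                # uses one observation
    avg = sum(-math.log(x) for x in xs) / n  # averages all n observations
    mse_single += (single - target) ** 2
    mse_avg += (avg - target) ** 2

# -log X ~ Exp(theta): E = 1/theta (unbiased), Var = 1/theta^2.
print(round(mse_single / reps, 3))  # near 1/theta^2 = 0.25
print(round(mse_avg / reps, 4))     # near 1/(n*theta^2) = 0.0125
```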