Hint of HW2: 4. Let X = (X1, ..., Xn) be an i.i.d. sample from the...
8.60 (modified): Let X1, ..., Xn be i.i.d. from an exponential distribution with the given density function. a. Check the regularity assumptions, and find the Fisher information I(τ). b. Find the CRLB. c. Find a sufficient statistic for τ. d. Show that t = X1 is unbiased, and use Rao-Blackwellization to construct the MVUE for τ. e. Find the MLE of τ. f. What is the exact sampling distribution of the MLE? g. Use the central limit theorem to find a normal approximation to the sampling distribution. h. ...
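For parts f and g, a quick simulation makes the comparison concrete. This is only a sketch under an assumption: the density in the problem statement is cut off, so the code takes f(x; τ) = (1/τ)e^(−x/τ) (mean-τ parameterization), under which the MLE is the sample mean, its exact sampling distribution is Gamma(n, τ/n), and the CLT approximation is N(τ, τ²/n).

```python
import random

# Sketch for parts f and g, ASSUMING f(x; tau) = (1/tau) * exp(-x/tau)
# (the density in the problem statement is cut off). Then the MLE is the
# sample mean, exactly Gamma(n, tau/n)-distributed, and the CLT gives the
# approximation N(tau, tau^2 / n). tau, n, reps are arbitrary choices.
random.seed(0)
tau, n, reps = 2.0, 50, 10000

mles = []
for _ in range(reps):
    sample = [random.expovariate(1.0 / tau) for _ in range(n)]  # mean-tau draws
    mles.append(sum(sample) / n)  # MLE = sample mean under this assumption

mean_mle = sum(mles) / reps
var_mle = sum((m - mean_mle) ** 2 for m in mles) / reps

print(mean_mle)  # close to tau = 2.0
print(var_mle)   # close to tau^2 / n = 0.08, the Gamma (and CLT) variance
```

Both the exact Gamma(n, τ/n) law and the normal approximation have mean τ and variance τ²/n, which is what the simulated moments should reproduce.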
7. Consider a random sample X1, ..., Xn from a population with a Bernoulli(θ) distribution. (a) Suppose n ≥ 3; show that the product W = X1 X2 X3 is an unbiased estimator of θ³. (b) Show that T = X1 + X2 + ... + Xn is a sufficient statistic for θ. (c) Using your answers to parts (a) and (b), use the Rao-Blackwell Theorem to find a better unbiased estimator of θ³. (Make sure you account for all cases.) (d) Show that T = ...
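Part (c) can be sanity-checked numerically. The conditional expectation works out to φ(t) = E[X1 X2 X3 | T = t] = t(t−1)(t−2)/(n(n−1)(n−2)); the sketch below verifies by exact enumeration (with small illustrative values n = 5, θ = 0.3, not from the problem) that both W and φ(T) have expectation θ³.

```python
from itertools import product

# Rao-Blackwell check by exact enumeration over all 2**n outcomes.
# n = 5 and theta = 0.3 are arbitrary illustrative choices.
n, theta = 5, 0.3

def prob(x):
    """Probability of a specific 0/1 outcome vector x."""
    s = sum(x)
    return theta**s * (1 - theta)**(n - s)

def phi(t):
    """Rao-Blackwellized estimator: E[X1*X2*X3 | T = t]."""
    return t * (t - 1) * (t - 2) / (n * (n - 1) * (n - 2))

e_w = sum(prob(x) * x[0] * x[1] * x[2] for x in product([0, 1], repeat=n))
e_phi = sum(prob(x) * phi(sum(x)) for x in product([0, 1], repeat=n))

print(e_w, e_phi)  # both equal theta**3 = 0.027 up to float rounding
```

Note that φ(t) = 0 for t < 3, which is the "account for all cases" part of the question.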
Let X1, ..., Xn be a random sample from a population X with p.d.f. fθ(x) = θ x^(θ−1) for 0 < x < 1, and 0 otherwise, where θ > 1 is a parameter. Find the MLE of 1/θ. If it is an unbiased estimator of 1/θ, compare its variance with the Cramér-Rao lower bound.
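Since −ln X ~ Exp(rate θ) under this density, the MLE of 1/θ is −(1/n) Σ ln Xi, which is unbiased with variance 1/(nθ²), equal to the CRLB. A simulation sketch (θ = 3, n = 40 are arbitrary illustrative choices):

```python
import math
import random

# Under f(x; theta) = theta * x**(theta - 1) on (0, 1), -ln X ~ Exp(rate theta),
# so the MLE of 1/theta is -(1/n) * sum(ln X_i): unbiased, variance 1/(n*theta^2).
# theta = 3, n = 40, reps = 10000 are arbitrary illustrative choices.
random.seed(1)
theta, n, reps = 3.0, 40, 10000

ests = []
for _ in range(reps):
    xs = [random.random() ** (1.0 / theta) for _ in range(n)]  # inverse CDF: F(x) = x**theta
    ests.append(-sum(math.log(x) for x in xs) / n)

mean_est = sum(ests) / reps
var_est = sum((e - mean_est) ** 2 for e in ests) / reps
crlb = 1.0 / (n * theta**2)  # CRLB for estimating 1/theta

print(mean_est)       # close to 1/theta = 0.333...
print(var_est, crlb)  # nearly equal: the estimator attains the bound
```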
Problem 2: Let (X1, ..., Xn) denote a random sample from X having density fX(x) = 1/β for 0 < x < β, where β > 0 is an unknown parameter. Explain why the Cramér-Rao Theorem cannot be applied to show that an unbiased estimator of β is MVU. (Hint: see the slides. Condition (A) of the Cramér-Rao Theorem.)
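The support (0, β) depends on β, so condition (A) (the regularity allowing differentiation under the integral sign) fails. A quick arithmetic check shows why the formally computed "bound" β²/n cannot be a real bound here: the unbiased estimator ((n+1)/n) X(n) has exact variance β²/(n(n+2)), which is smaller. Sketch with arbitrary β = 5, n = 10:

```python
# Exact variance comparison; beta = 5 and n = 10 are arbitrary choices.
beta, n = 5.0, 10

formal_crlb = beta**2 / n                   # would-be "CRLB" if the theorem applied
var_unbiased_max = beta**2 / (n * (n + 2))  # exact variance of ((n+1)/n) * X_(n)

print(var_unbiased_max < formal_crlb)  # True: the formal "bound" is beaten
```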
Let X1, X2, ..., Xn be a random sample from Poisson(θ), θ > 0. (b) Let Y = X1 + X2 + ... + Xn. Determine the value of a constant c such that the estimator e^(cY) is an unbiased estimator of e^(−θ). (c) Get the lower bound for the variance of the unbiased estimator found in (b).
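For part (b), with Y = X1 + ... + Xn ~ Poisson(nθ), the mgf gives E[e^(cY)] = exp(nθ(e^c − 1)), so unbiasedness for e^(−θ) forces n(e^c − 1) = −1, i.e. c = ln(1 − 1/n). A numerical sketch (n = 7, θ = 1.3 arbitrary) confirming the identity by summing the pmf:

```python
import math

# Check that c = ln(1 - 1/n) makes E[e^{cY}] = e^{-theta}, where
# Y ~ Poisson(n * theta). n = 7, theta = 1.3 are arbitrary choices.
n, theta = 7, 1.3
c = math.log(1 - 1 / n)
lam = n * theta  # Y ~ Poisson(lam)

# Sum e^{cy} * P(Y = y) in log space to avoid overflowing factorials;
# the tail beyond y = 200 is negligible for lam ~ 9.
e_cy = sum(math.exp(c * y - lam + y * math.log(lam) - math.lgamma(y + 1))
           for y in range(200))

print(e_cy, math.exp(-theta))  # agree: exp(n*theta*(e^c - 1)) = exp(-theta)
```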
7. Let X1, ..., Xn be i.i.d. with the density p(x, θ) = θ^x (1 − θ)^(1−x) 1{x ∈ {0, 1}}. (a) Find the ML estimator of θ. (b) Is it unbiased? (c) Compute its MSE.
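For reference, the MLE here is the sample mean, which is unbiased with MSE θ(1−θ)/n. A sketch verifying this by exact enumeration over all 2^n outcomes (n = 6, θ = 0.4 are arbitrary illustrative choices):

```python
from itertools import product

# Exact check of unbiasedness and MSE of the Bernoulli MLE (the sample mean)
# by enumerating all 2**n outcomes. n = 6, theta = 0.4 are arbitrary.
n, theta = 6, 0.4

def prob(x):
    """Probability of a specific 0/1 outcome vector x."""
    s = sum(x)
    return theta**s * (1 - theta)**(n - s)

outcomes = list(product([0, 1], repeat=n))
mle_mean = sum(prob(x) * sum(x) / n for x in outcomes)
mle_mse = sum(prob(x) * (sum(x) / n - theta) ** 2 for x in outcomes)

print(mle_mean)                           # equals theta: the MLE is unbiased
print(mle_mse, theta * (1 - theta) / n)   # MSE matches theta*(1-theta)/n
```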
Advanced Statistics, I need help with (c) and (d)
2. Let X1, X2, ..., Xn be a random sample from a Bernoulli(θ) distribution with probability function p(x; θ) = θ^x (1 − θ)^(1−x), x ∈ {0, 1}. Note that, for a random variable X with a Bernoulli(θ) distribution, E[X] = θ and var[X] = θ(1 − θ). (a) Obtain the log-likelihood function, L(θ), and hence show that the maximum likelihood estimator of θ is θ̂ = (1/n) Σ Xi. (b) Show that dL(θ)/dθ ... (c) Calculate the expected information I(θ) = E[−d²L(θ)/dθ²]. (d) Show ...
7. Let X1, ..., Xn be a random sample from a Bernoulli distribution with parameter p. A random variable X with a Bernoulli distribution has a probability mass function (pmf) of p^x (1 − p)^(1−x), x ∈ {0, 1}, with E(X) = p and Var(X) = p(1 − p). (a) Find the method of moments (MOM) estimator of p. (b) Find a sufficient statistic for p. (Hint: Be careful when you write the joint pmf. Don't forget to sum the whole power of each term; that is, for the second term you will have (1...
Let X1, X2, ..., Xn be a random sample of size n from a population that can be modeled by the following probability model: fX(x) = a x^(a−1) / θ^a for 0 < x < θ, a > 0. a) Find the probability density function of X(n) = max(X1, X2, ..., Xn). b) Is X(n) an unbiased estimator for θ? If not, suggest a function of X(n) that is an unbiased estimator for θ.
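Since F(x) = (x/θ)^a on (0, θ), one finds E[X(n)] = naθ/(na + 1) < θ, so X(n) is biased low and ((na + 1)/(na)) X(n) is unbiased. A simulation sketch (θ = 4, a = 2, n = 8 are arbitrary illustrative choices):

```python
import random

# Simulation check of the bias correction for the max of a power-law sample.
# theta = 4, a = 2, n = 8, reps = 20000 are arbitrary illustrative choices.
random.seed(2)
theta, a, n, reps = 4.0, 2.0, 8, 20000
na = n * a

raw, corrected = [], []
for _ in range(reps):
    mx = max(theta * random.random() ** (1 / a) for _ in range(n))  # inverse CDF: F(x) = (x/theta)**a
    raw.append(mx)
    corrected.append((na + 1) / na * mx)

mean_raw = sum(raw) / reps
mean_corr = sum(corrected) / reps

print(mean_raw)   # close to n*a*theta/(n*a + 1) < theta (biased low)
print(mean_corr)  # close to theta = 4.0 (bias removed)
```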
Question 3 [15 marks] Let X1, ..., Xn be independent identically distributed random variables with common pdf fX(x; θ) = (1/θ²) x e^(−x/θ) for x > 0, and 0 otherwise, where θ > 0 is an unknown parameter. (a) Let Y = X1 + ... + Xn. Show that Y ~ Γ(2n, θ). (b) Show that T = (1/n) Σ 1/Xi is an unbiased estimator of θ^(−1). (c) Compute U = ∂ℓ(θ; X)/∂θ, where ℓ(θ; X) is the log-likelihood function. (d) What functions τ(θ) have unbiased estimators that attain the relevant ...
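The density in the source is garbled; its fragments are consistent with X ~ Gamma(shape 2, scale θ), i.e. f(x; θ) = (1/θ²) x e^(−x/θ) for x > 0, and the sketch below assumes exactly that reading. Under this assumption E[1/X] = 1/θ, which is what makes T in part (b) unbiased; a crude midpoint-rule quadrature check:

```python
import math

# ASSUMPTION: the garbled density is read as Gamma(shape 2, scale theta),
# f(x; theta) = x * exp(-x / theta) / theta**2 for x > 0. Then the integrand
# (1/x) * f(x) = exp(-x/theta) / theta**2 integrates to E[1/X] = 1/theta,
# so T = (1/n) * sum(1/X_i) is unbiased for 1/theta.
# theta = 2.5 is an arbitrary illustrative choice.
theta = 2.5
h, upper = 1e-3, 100.0  # midpoint rule: step h on (0, upper); tail is negligible

xs = (h * (k + 0.5) for k in range(int(upper / h)))
e_inv = sum((1 / x) * (x * math.exp(-x / theta) / theta**2) * h for x in xs)

print(e_inv, 1 / theta)  # agree closely
```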