Problem 2. [Rice, Problem 7, pg. 314 (Extended)] Suppose that X1, ..., Xn are iid Geometric(p). (a) Find...
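The problem statement is truncated, but as a quick numerical check on the usual answer: assuming Rice's {1, 2, ...} parameterization with pmf p(1 - p)^(k-1), the MLE is p-hat = 1/X-bar. A simulation sketch (the function names are mine, not from the problem):

```python
import random

def sample_geometric(p, rng):
    """Draw one Geometric(p) value on {1, 2, ...}: number of Bernoulli
    trials up to and including the first success."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

def geom_mle(xs):
    """MLE of p under the {1, 2, ...} parameterization: p_hat = 1 / xbar."""
    return len(xs) / sum(xs)

rng = random.Random(0)
xs = [sample_geometric(0.3, rng) for _ in range(20000)]
p_hat = geom_mle(xs)
# For a sample this large, p_hat should land close to the true p = 0.3.
```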
Suppose that X1, X2, ..., Xn is an iid sample, each with probability p of being distributed as uniform over (-1/2, 1/2) and with probability 1 - p of being distributed as uniform over ...
(a) Find the cumulative distribution function (cdf) and the probability density function (pdf) of X1.
(b) Find the maximum likelihood estimator (MLE) of p.
(c) Find another estimator of p using the method of moments (MOM).
Suppose X1, X2, ..., Xn is an iid sample from a uniform distribution over (θ, θ + 1), where θ is unknown.
(a) Find the method of moments estimator of θ.
(b) Find the maximum likelihood estimator (MLE) of θ.
(c) Is the MLE of θ a consistent estimator of θ? Explain.
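A sketch for part (a), assuming (my reconstruction of the garbled interval) that the support is (θ, θ + 1): then E(X) = θ + 1/2, so matching the first moment gives the MOM estimator θ~ = X-bar - 1/2. The function name is mine:

```python
import random

def mom_theta(xs):
    """Method-of-moments estimator for Uniform(theta, theta + 1):
    E[X] = theta + 1/2, so theta_tilde = sample mean - 1/2."""
    return sum(xs) / len(xs) - 0.5

rng = random.Random(1)
theta = 2.0
xs = [theta + rng.random() for _ in range(20000)]  # Uniform(theta, theta + 1)
theta_tilde = mom_theta(xs)
# theta_tilde should be close to the true theta = 2.0.
```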
Suppose that X1, X2, ..., Xn is an iid sample from the probability mass function (pmf) given by
p(x; θ) = (1 - θ)θ^x for x = 0, 1, 2, ..., and 0 otherwise, where 0 < θ < 1.
(a) Find the maximum likelihood estimator of θ.
(b) Find the Cramér-Rao Lower Bound (CRLB) on the variance of unbiased estimators of E_θ(X). Can this lower bound be attained?
(c) Find the method of moments estimator of θ.
(d) Put a beta(2, 3) prior distribution on θ. Find the posterior mean. Treating this as a fre-...
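For parts (a) and (c): maximizing n log(1 - θ) + (Σ x_i) log θ gives θ-hat = X-bar/(1 + X-bar), and matching E_θ(X) = θ/(1 - θ) to X-bar gives the same formula, so the MOM and MLE coincide here. A simulation sketch (function names are mine):

```python
import random

def mle_theta(xs):
    """MLE for pmf p(x) = (1 - theta) * theta**x, x = 0, 1, 2, ...:
    theta_hat = xbar / (1 + xbar).  Matching E[X] = theta/(1 - theta)
    to xbar gives the same estimator, so MOM = MLE for this family."""
    xbar = sum(xs) / len(xs)
    return xbar / (1 + xbar)

def sample_x(theta, rng):
    """Draw from p(x) = (1 - theta) * theta**x by counting successes
    before the first 'failure' (each success has probability theta)."""
    x = 0
    while rng.random() < theta:
        x += 1
    return x

rng = random.Random(2)
xs = [sample_x(0.6, rng) for _ in range(20000)]
theta_hat = mle_theta(xs)
# theta_hat should be close to the true theta = 0.6.
```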
Suppose that X1, X2, ..., Xn is an iid sample of size n from a Pareto pdf of the form
f(x; θ) = θ x^-(θ+1) for x > 1, and 0 otherwise, where θ > 0.
(a) Find the method of moments (MOM) estimator for θ. For what values of θ does the MOM estimator exist? Why?
(b) Find the maximum likelihood estimator (MLE) for θ.
(c) Show explicitly that the MLE depends on the sufficient statistic for this Pareto family, but that the MOM estimator does not.
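Assuming the garbled density is f(x; θ) = θ x^-(θ+1) on x > 1 (consistent with the existence question in (a), since then E(X) = θ/(θ - 1) exists only for θ > 1), a simulation sketch of both estimators; the function names are mine:

```python
import math
import random

def mle_theta(xs):
    """MLE for f(x) = theta * x**-(theta + 1), x > 1:
    theta_hat = n / sum(log x_i).  It depends on the data only
    through the sufficient statistic sum(log x_i)."""
    return len(xs) / sum(math.log(x) for x in xs)

def mom_theta(xs):
    """MOM: match E[X] = theta/(theta - 1) (finite only for theta > 1)
    to xbar, giving theta_tilde = xbar/(xbar - 1).  It depends on
    sum(x_i), not on the sufficient statistic sum(log x_i)."""
    xbar = sum(xs) / len(xs)
    return xbar / (xbar - 1)

rng = random.Random(3)
theta = 3.0
# Inverse-CDF sampling: F(x) = 1 - x**-theta, so X = U**(-1/theta).
xs = [rng.random() ** (-1 / theta) for _ in range(20000)]
```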
Let X1, X2, ..., Xn be iid exponential random variables with unknown mean β.
(b) Find the maximum likelihood estimator of β.
(c) Determine whether the maximum likelihood estimator is unbiased for β.
(d) Find the mean squared error of the maximum likelihood estimator of β.
(e) Find the Cramér-Rao lower bound for the variances of unbiased estimators of β.
(f) What is the UMVUE (uniformly minimum variance unbiased estimator) of β? What is your reason?
(g) Determine the asymptotic distribution of the...
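A numerical sanity check on the standard answers to (b)-(e): the MLE is β-hat = X-bar, which is unbiased with MSE = Var = β²/n, equal to the CRLB. A simulation sketch (names are mine):

```python
import random
import statistics

def mle_beta(xs):
    """MLE for an exponential sample parameterized by its mean beta:
    beta_hat = sample mean."""
    return sum(xs) / len(xs)

rng = random.Random(4)
beta, n, reps = 2.0, 50, 4000
# expovariate(lambd) has mean 1/lambd, so 1/beta gives mean beta.
estimates = [mle_beta([rng.expovariate(1 / beta) for _ in range(n)])
             for _ in range(reps)]
bias = statistics.mean(estimates) - beta                    # ~ 0 (unbiased)
mse = statistics.mean((b - beta) ** 2 for b in estimates)   # ~ beta**2/n = 0.08
```

Here the empirical MSE should sit near the CRLB value β²/n = 4/50 = 0.08, illustrating that X-bar attains the bound.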
4. (120) Let X1, ..., Xn be iid Γ(μ, 1) and let g(μ) = P(X1 > c), c > 0, be given. Let δn be the MLE of g(μ).
(1) (60) Find the asymptotic distribution of δn.
(2) (60) Find the ARE of Tn = (1/n) Σ_{i=1}^n I_{(c,∞)}(Xi) with respect to δn.

5. (80) Let X1, ..., Xn be iid with E(X1) = μ and Var(X1) = σ². Find the limiting distribution of n log(1 + Tn²/n), where Tn = √n(X̄ - μ)/S.
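For problem 5, a worked sketch of the limit, under my reconstruction of the garbled expression (Tn = √n(X̄ - μ)/S and target n log(1 + Tn²/n)):

```latex
\begin{gathered}
T_n = \frac{\sqrt{n}\,(\bar X_n - \mu)}{S_n} \xrightarrow{d} N(0,1)
\ \text{(CLT and Slutsky, since } S_n \xrightarrow{p} \sigma\text{)},
\qquad T_n^2 \xrightarrow{d} \chi^2_1, \\[4pt]
n \log\!\left(1 + \frac{T_n^2}{n}\right)
  = T_n^2 + O_p\!\left(\frac{T_n^4}{n}\right)
  = T_n^2 + o_p(1) \xrightarrow{d} \chi^2_1,
\end{gathered}
```

using log(1 + t) = t + O(t²) as t → 0; so the limiting distribution is chi-squared with one degree of freedom.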
8. (100) Let X1, ..., Xn be iid from Γ(α, β).
(1) (50) Find the limiting distribution of the MLE of β.
(2) (30) Find the limiting distribution of the MLE of β when α is known.
(3) (20) Compare the two asymptotic variances in (1) and (2), and comment on them.
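A sketch of the comparison in (3), assuming the scale parameterization f(x; α, β) = x^(α-1) e^(-x/β) / (Γ(α) β^α) (the problem's notation is garbled, so this convention is my assumption). The per-observation Fisher information matrix and the two asymptotic variances are:

```latex
I(\alpha,\beta) =
\begin{pmatrix} \psi'(\alpha) & 1/\beta \\ 1/\beta & \alpha/\beta^2 \end{pmatrix},
\qquad
[I^{-1}]_{\beta\beta} = \frac{\beta^2\,\psi'(\alpha)}{\alpha\,\psi'(\alpha) - 1}
\ \ (\alpha\ \text{unknown}),
\qquad
\frac{1}{I_{\beta\beta}} = \frac{\beta^2}{\alpha}
\ \ (\alpha\ \text{known}).
```

Since α ψ'(α) > 1, the unknown-α variance strictly exceeds β²/α: having to estimate α inflates the asymptotic variance of the MLE of β.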
Suppose that X1, X2, ..., Xn is an iid sample from a U(0, θ) distribution, where θ > 0. In turn, the parameter θ is best regarded as a random variable with a Pareto(a, b) distribution, that is,
π(θ) = b a^b / θ^(b+1) for θ ≥ a, and 0 otherwise,
where a > 0 and b > 0 are known.
(a) Turn the "Bayesian crank" to find the posterior distribution of θ. I would probably start by working with a sufficient statistic.
(b) Find the posterior mean and use this as...
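A sketch of the conjugate update in (a)-(b): the likelihood is θ^-n on {θ ≥ max x_i} and the prior is proportional to θ^-(b+1) on {θ ≥ a}, so the posterior is Pareto(max(a, max x_i), b + n), and its mean follows from the Pareto mean formula. The function name is mine:

```python
def pareto_posterior(xs, a, b):
    """Posterior of theta for U(0, theta) data under a Pareto(a, b) prior.
    Likelihood * prior is proportional to theta**-(n + b + 1) on
    {theta >= max(a, max(xs))}, i.e. Pareto(max(a, max(xs)), b + n).
    Returns (posterior scale, posterior shape, posterior mean), where the
    Pareto(s, k) mean is k*s/(k - 1), valid here since k = b + n > 1."""
    scale = max(a, max(xs))
    shape = b + len(xs)
    post_mean = shape * scale / (shape - 1)
    return scale, shape, post_mean
```

For example, with data [0.8, 2.5, 1.1] and a = 1, b = 2, the posterior is Pareto(2.5, 5) with mean 3.125.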
Let X1, ..., Xn be n iid random variables with distribution N(θ, θ) for some unknown θ > 0. In the last homework, you computed the maximum likelihood estimator θ̂ for θ in terms of the sample averages of the linear and quadratic terms, i.e., (1/n) Σ Xi and (1/n) Σ Xi², and applied the CLT and the delta method to find its asymptotic variance. In this problem, you will compute the asymptotic variance of θ̂ via the Fisher information...
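A sketch of the Fisher-information route (my own derivation, worth re-checking against the delta-method answer): with ℓ(θ) the log-density of one N(θ, θ) observation, differentiating twice and taking expectations with E(X) = θ and E(X - θ)² = θ gives

```latex
\begin{gathered}
\ell(\theta) = -\tfrac12 \log(2\pi\theta) - \frac{(X-\theta)^2}{2\theta},
\qquad
\ell''(\theta) = \frac{1}{2\theta^2} - \frac{X}{\theta^2}
  - \frac{X-\theta}{\theta^2} - \frac{(X-\theta)^2}{\theta^3}, \\[4pt]
I(\theta) = -\mathbb{E}\,\ell''(\theta)
  = \frac{1}{\theta} + \frac{1}{2\theta^2}
  = \frac{2\theta+1}{2\theta^2},
\qquad
\sqrt{n}\,(\hat\theta_n - \theta) \xrightarrow{d}
  N\!\left(0,\ \frac{2\theta^2}{2\theta+1}\right).
\end{gathered}
```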
7. Let X1, ..., Xn be a random sample from a Bernoulli distribution with parameter p. A random variable X with a Bernoulli distribution has a probability mass function (pmf) of
p(x) = p^x (1 - p)^(1-x) for x = 0, 1,
with E(X) = p and Var(X) = p(1 - p).
(a) Find the method of moments (MOM) estimator of p.
(b) Find a sufficient statistic for p. (Hint: Be careful when you write the joint pmf. Don't forget to sum the whole power of each term, that is, for the second term you will have (1...
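A minimal sketch of the standard answers to (a) and (b); the function names are mine:

```python
def mom_p(xs):
    """Method-of-moments estimator for Bernoulli(p): match E[X] = p to
    the first sample moment, giving p_tilde = xbar."""
    return sum(xs) / len(xs)

def sufficient_stat(xs):
    """The joint pmf factors as p**sum(x) * (1 - p)**(n - sum(x)), so by
    the factorization theorem T = sum(x_i) is sufficient for p."""
    return sum(xs)
```

For example, for the sample [1, 0, 1, 1]: mom_p gives 0.75 and the sufficient statistic is 3.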