i.i.d. U (9.1). Find the ML estimator and the pdf of the ML estimator. Is the ML estimator unbiased? If so, construct...
7. Let X1, ..., Xn be i.i.d. with the density p(x, θ) = θ^x (1 − θ)^(1−x) I{x = 0, 1}.
(a) Find the ML estimator of θ.
(b) Is it unbiased?
(c) Compute its MSE.
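A quick NumPy simulation to sanity-check the closed-form answers for problem 7 (not part of the original problem; θ = 0.3 and n = 20 are arbitrary choices): the ML estimator is the sample mean, it is unbiased, and its MSE is θ(1 − θ)/n.

```python
import numpy as np

# Monte Carlo check: for Bernoulli(theta), the ML estimator is
# theta_hat = X_bar, which is unbiased with MSE = theta*(1 - theta)/n.
rng = np.random.default_rng(0)
theta, n, reps = 0.3, 20, 200_000
samples = rng.binomial(1, theta, size=(reps, n))
theta_hat = samples.mean(axis=1)          # ML estimate in each replication
bias = theta_hat.mean() - theta           # should be ~0 (unbiased)
mse = ((theta_hat - theta) ** 2).mean()   # should be ~theta*(1-theta)/n
print(bias, mse, theta * (1 - theta) / n)
```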
6. Let X1, ..., Xn be i.i.d. N(μ, σ²).
(a) Find the sample analogue estimator of θ.
(b) Find the ML estimator of θ.
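A small numeric illustration for problem 6 (not part of the original problem, and assuming θ = (μ, σ²)): for normal data, both the sample-analogue (method-of-moments) and ML estimators come out to μ̂ = X̄ and σ̂² = (1/n) Σ (Xi − X̄)², i.e. the variance estimate divides by n, not n − 1.

```python
import numpy as np

# Sketch: for i.i.d. N(mu, sigma^2), the sample-analogue and ML estimators
# of theta = (mu, sigma^2) coincide:
#   mu_hat = X_bar,  sigma2_hat = (1/n) * sum((X_i - X_bar)**2)
rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=3.0, size=1_000)   # true mu=2, sigma^2=9
mu_hat = x.mean()
sigma2_hat = ((x - mu_hat) ** 2).mean()          # note: 1/n, not 1/(n-1)
print(mu_hat, sigma2_hat)
```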
Let X1, X2, ..., Xn be an i.i.d. sample from some distribution with mean μ and variance σ². Let us construct several estimators for μ: μ̂1 = X̄, μ̂2 = X1, μ̂3 = (X1 + X2)/2, μ̂4 = X1 + X2.
(a) Are they unbiased estimators for μ?
(b) Compute the MSE for all 4 estimators.
(c) Which one is the best estimator for μ? Why?
PLEASE answer all parts, thanks
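A Monte Carlo comparison of the four estimators (not part of the original problem; μ = 1.5, σ² = 4, n = 10 are arbitrary choices). Theory says μ̂1, μ̂2, μ̂3 are unbiased while μ̂4 has E[μ̂4] = 2μ, and the MSEs are σ²/n, σ², σ²/2, and μ² + 2σ², so X̄ wins.

```python
import numpy as np

# Compare MSEs of the four estimators by simulation.
rng = np.random.default_rng(2)
mu, s2, n, reps = 1.5, 4.0, 10, 200_000
x = rng.normal(mu, np.sqrt(s2), size=(reps, n))
ests = {
    "mu1 (X_bar)":     x.mean(axis=1),
    "mu2 (X1)":        x[:, 0],
    "mu3 ((X1+X2)/2)": (x[:, 0] + x[:, 1]) / 2,
    "mu4 (X1+X2)":     x[:, 0] + x[:, 1],      # biased: E = 2*mu
}
mse = {k: ((v - mu) ** 2).mean() for k, v in ests.items()}
print(mse)  # X_bar has the smallest MSE
```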
Q3. Suppose X1, X2, ..., Xn are i.i.d. Poisson random variables with expected value λ. It is well known that X̄ is an unbiased estimator for λ because λ = E(X̄).
1. Show that (X1 + Xn)/2 is also an unbiased estimator for λ.
2. Show that S² = Σ (Xi − X̄)²/(n − 1) is also an unbiased estimator for λ.
3. Find MSE(S²). (We will need two facts.)
Fact 1: Var(S²) = (1/n)(μ4 − (n − 3)σ⁴/(n − 1)). (See math.stackexchange.com/questions/2476527/variance-of-sample-variance.)
Fact 2: For the Poisson distribution, E[(X − μ)⁴] = 3λ² + λ.
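A simulation sanity check for Q3 (not part of the original problem; λ = 2, n = 15 are arbitrary choices). Combining the two facts for the Poisson case (σ² = λ, μ4 = 3λ² + λ) gives MSE(S²) = Var(S²) = λ/n + 2λ²/(n − 1).

```python
import numpy as np

# Check unbiasedness of X_bar, (X1+Xn)/2, S^2, and the MSE(S^2) formula.
rng = np.random.default_rng(3)
lam, n, reps = 2.0, 15, 200_000
x = rng.poisson(lam, size=(reps, n))
xbar = x.mean(axis=1)
pair = (x[:, 0] + x[:, -1]) / 2
s2 = x.var(axis=1, ddof=1)               # sample variance, 1/(n-1)
mse_s2 = ((s2 - lam) ** 2).mean()
theory = lam / n + 2 * lam**2 / (n - 1)  # from Facts 1 and 2
print(xbar.mean(), pair.mean(), s2.mean(), mse_s2, theory)
```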
2. Let X1, ..., Xn be i.i.d. according to a normal distribution N(μ, σ²).
(a) Get a sufficient statistic for μ. Show your work.
(b) Find the maximum likelihood estimator for μ.
(c) Show that the MLE in part (b) is an unbiased estimator for μ.
(d) Using Basu's theorem, prove that your MLE from before and S², the sample variance, are independent. (Hint: use Wi = (Xi − μ)/σ and (n − 1)S²/σ².)
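A numeric illustration of part (d), not a proof and not part of the original problem: if the MLE X̄ and S² are independent (as Basu's theorem shows for normal samples), their sample correlation across many replications should be near 0.

```python
import numpy as np

# For normal samples, X_bar and S^2 are independent, so the correlation
# of the two statistics across replications should be ~0.
rng = np.random.default_rng(4)
x = rng.normal(5.0, 2.0, size=(200_000, 10))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)
corr = np.corrcoef(xbar, s2)[0, 1]
print(corr)  # ~0
```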
Let X1, ..., Xn be i.i.d. random variables N(μ, σ²) where μ is a known parameter and σ² is the unknown parameter. Let γ(σ²) = σ².
(i) Find the CRLB for γ(σ²).
(ii) Recall that S² is an unbiased estimator for σ². Compare Var(S²) to the CRLB for γ(σ²).
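A simulation check of the comparison in (ii), not part of the original problem (μ = 0, σ² = 4, n = 12 are arbitrary choices): the CRLB here is 2σ⁴/n, while Var(S²) = 2σ⁴/(n − 1), so S² is unbiased but does not attain the bound.

```python
import numpy as np

# Compare the Monte Carlo Var(S^2) to 2*s2^2/(n-1) and to the CRLB 2*s2^2/n.
rng = np.random.default_rng(5)
mu, s2, n, reps = 0.0, 4.0, 12, 200_000
x = rng.normal(mu, np.sqrt(s2), size=(reps, n))
var_s2 = x.var(axis=1, ddof=1).var()      # Monte Carlo Var(S^2)
crlb = 2 * s2**2 / n
print(var_s2, 2 * s2**2 / (n - 1), crlb)  # Var(S^2) exceeds the CRLB
```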
Exercise 5 (Sample variance is unbiased). Let X1, ..., Xn be i.i.d. samples from some distribution with mean μ and finite variance. Define the sample variance S² = (n − 1)⁻¹ Σ (Xi − X̄)². We will show that S² is an unbiased estimator of the population variance Var(X1).
(i) Show that Σ (Xi − X̄) = 0.
(ii) Show that E[Σ (Xi − X̄)²] = (n − 1) Var(X1). (Hint: write Xi − μ = (Xi − X̄) + (X̄ − μ), so that Σ (Xi − μ)² = Σ (Xi − X̄)² + 2(X̄ − μ) Σ (Xi − X̄) + n(X̄ − μ)², and use E[Σ (Xi − μ)²] = n Var(X1) together with E[(X̄ − μ)²] = Var(X1)/n.)
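A numeric check of Exercise 5 (not a proof, and not part of the original exercise): unbiasedness of S² holds for any distribution with finite variance, so here it is verified on Exponential(1) data, where Var(X1) = 1.

```python
import numpy as np

# E[S^2] = Var(X_1) for any distribution with finite variance.
# Exponential(1) has Var(X_1) = 1, so s2.mean() should be ~1.
rng = np.random.default_rng(6)
x = rng.exponential(1.0, size=(200_000, 8))
s2 = x.var(axis=1, ddof=1)    # (n-1)^{-1} * sum (X_i - X_bar)^2
print(s2.mean())              # ~1 = Var(X_1)
```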
5. Let X1, ..., Xn be i.i.d. N(μ, σ²), with σ² known. Let μ̂n = X̄n + 5/n be an estimator of μ.
(a) Is μ̂n an unbiased estimator for μ?
(b) For a particular fixed n, find the distribution of μ̂n.
(c) Find the mean squared error (MSE) of μ̂n.
(d) Prove that μ̂n is consistent for μ.
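A simulation sketch for problem 5, not part of the original problem and assuming the estimator is μ̂n = X̄n + 5/n (the garbled source is consistent with this reading, and it makes part (d) sensible): the bias is 5/n, μ̂n ~ N(μ + 5/n, σ²/n), and MSE = σ²/n + 25/n² → 0, so the estimator is biased but consistent.

```python
import numpy as np

# Assumed estimator: mu_hat_n = X_bar_n + 5/n.
# Bias = 5/n and MSE = s2/n + 25/n**2 both vanish as n grows.
rng = np.random.default_rng(7)
mu, s2, reps = 1.0, 4.0, 50_000
mses = []
for n in (10, 50, 250):
    x = rng.normal(mu, np.sqrt(s2), size=(reps, n))
    mu_hat = x.mean(axis=1) + 5 / n
    mses.append(((mu_hat - mu) ** 2).mean())
    print(n, mu_hat.mean() - mu, mses[-1])   # bias ~ 5/n, MSE shrinking
```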