6. Let X1, …, Xn be i.i.d. N(μ, σ²). (a) Find the sample analogue estimator of θ. (b) Find the ML estimator of θ.
2. Let X1, …, Xn be i.i.d. according to a normal distribution N(μ, σ²). (a) Find a sufficient statistic for μ. Show your work. (b) Find the maximum likelihood estimator for μ. (c) Show that the MLE in part (b) is an unbiased estimator for μ. (d) Using Basu's theorem, prove that your MLE from before and S², the sample variance, are independent. (Hint: use Wi = Xi − μ and (n − 1)S².)
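A seeded Monte Carlo sketch (a numerical sanity check, not a proof) for parts (c) and (d), using illustrative values μ = 2, σ = 3, n = 5 that are not from the problem: the sample mean should average to μ, and its sample correlation with S² should sit near zero, consistent with the independence that Basu's theorem delivers.

```python
import numpy as np

# Monte Carlo sketch: check E[X-bar] = mu and Corr(X-bar, S^2) ~ 0
# for i.i.d. N(mu, sigma^2) data (illustrative parameters, not from the text).
rng = np.random.default_rng(0)
mu, sigma, n, reps = 2.0, 3.0, 5, 200_000

X = rng.normal(mu, sigma, size=(reps, n))
xbar = X.mean(axis=1)               # the MLE of mu
s2 = X.var(axis=1, ddof=1)          # the sample variance S^2

bias = xbar.mean() - mu             # should be ~ 0 (unbiasedness)
corr = np.corrcoef(xbar, s2)[0, 1]  # should be ~ 0 (independence)
print(bias, corr)
```

Zero correlation alone does not imply independence, of course; the simulation only shows the result is plausible before you write the Basu argument.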
7. Let X1, …, Xn be i.i.d. with the density p(x, θ) = θ^x (1 − θ)^(1−x) I{x = 0, 1}. (a) Find the ML estimator of θ. (b) Is it unbiased? (c) Compute its MSE.
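Since the density above is Bernoulli(θ), the ML estimator works out to θ̂ = X̄ (the derivation is the exercise). An exact check of its bias and MSE by summing over the binomial distribution of the success count, with illustrative values n = 10, θ = 0.3 (assumptions, not from the text):

```python
from math import comb

# Exact check: for X_i ~ Bernoulli(theta), the MLE is theta_hat = X-bar.
# The success count k ~ Binomial(n, theta), so enumerate k to get exact
# E[theta_hat] (unbiasedness) and MSE = theta*(1 - theta)/n.
n, theta = 10, 0.3

pmf = [comb(n, k) * theta**k * (1 - theta)**(n - k) for k in range(n + 1)]
mean = sum(p * k / n for k, p in enumerate(pmf))
mse = sum(p * (k / n - theta) ** 2 for k, p in enumerate(pmf))

print(mean, mse)  # mean = theta, mse = theta*(1-theta)/n = 0.021
```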
Only Questions 4, 5 and 6. a = 5. Problem 1. Let (X1, …, Xn) be an i.i.d. random sample with Xi ~ U[0, 2a], and (Y1, …, Yn) be an i.i.d. random sample with Yi ~ Exp(…). 1. Find E[Xi], E[Xi²], E[Yi] and E[Yi²]. 2. Notwithstanding the actual distributions of the random samples, suppose the modeller believes that they are i.i.d. draws from a U(0, 2a) distribution. Find the (simple) method of moments estimator â. 3. Let n = 1000. …
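For part 2, under the believed U(0, 2a) model the first moment is E[X] = a, so matching moments gives â = X̄n. A seeded simulation sketch with the stated a = 5 and n = 1000:

```python
import random

# Method-of-moments sketch: if X ~ U(0, 2a) then E[X] = a, so matching
# the first sample moment to the first population moment gives
# a_hat = sample mean. Uses a = 5 and n = 1000 from the problem statement.
random.seed(42)
a, n = 5.0, 1000

xs = [random.uniform(0, 2 * a) for _ in range(n)]
a_hat = sum(xs) / n
print(a_hat)  # close to a = 5
```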
5. Let Xi, i = 1, 2, …, n be i.i.d. U(0, 1). Let Z = max{X1, …, Xn} and find F_Z.
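By independence, F_Z(z) = P(all Xi ≤ z) = zⁿ on [0, 1]. A seeded simulation check at illustrative values n = 5, z = 0.9 (assumed, not from the text):

```python
import random

# Check F_Z(z) = z^n for Z = max of n i.i.d. U(0,1) draws, at one point z.
random.seed(0)
n, z, reps = 5, 0.9, 200_000

hits = sum(max(random.random() for _ in range(n)) <= z for _ in range(reps))
empirical = hits / reps
theory = z**n  # 0.9^5 = 0.59049
print(empirical, theory)
```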
Exercise 8.41. The random variables X1, …, Xn are i.i.d. We also know that E[Xi] = 0, E[Xi²] = a and E[Xi³] = b. Let X̄n = (X1 + … + Xn)/n. Find the third moment of X̄n.
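Since E[Xi] = 0, every cross term in E[(X1 + … + Xn)³] that involves a lone factor vanishes, leaving n·E[Xi³], so E[X̄n³] = b/n². An exact enumeration check using a hypothetical two-point distribution (X = −1 w.p. 2/3, X = 2 w.p. 1/3, which gives E[X] = 0, a = 2, b = 2) and n = 3:

```python
from fractions import Fraction
from itertools import product

# X takes value -1 w.p. 2/3 and 2 w.p. 1/3, so E[X] = 0, E[X^2] = 2 (= a),
# E[X^3] = 2 (= b).  Enumerate all 2^n outcomes for n = 3 to compute
# E[X-bar^3] exactly; the claim is that it equals b / n^2.
dist = {Fraction(-1): Fraction(2, 3), Fraction(2): Fraction(1, 3)}
n = 3

third_moment = Fraction(0)
for outcome in product(dist, repeat=n):
    p = Fraction(1)
    for x in outcome:
        p *= dist[x]
    xbar = sum(outcome) / n
    third_moment += p * xbar**3

b = Fraction(2)
print(third_moment, b / n**2)  # both 2/9
```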
Let X1, X2, …, Xn be an i.i.d. sample from Bernoulli(p) and let Yn = …. Show that Yn converges to a degenerate distribution at 0 as n → ∞.
6. Let X1, …, Xn be a random sample from Uniform(0, 1). (a) Find the exact distribution of U = −log(X(1)), where X(1) = min(X1, X2, …, Xn). (b) Find the limiting distribution of n(1 − X(n)), where X(n) = max(X1, X2, …, Xn).
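For part (b), the survival function is P(n(1 − X(n)) > t) = P(X(n) < 1 − t/n) = (1 − t/n)ⁿ → e^(−t), i.e. the limit is Exp(1). A numeric check of the finite-n survival function against the exponential limit, at a few illustrative t values:

```python
import math

# Survival function of n*(1 - X_(n)) for X_i i.i.d. U(0,1):
#   P(n*(1 - X_(n)) > t) = (1 - t/n)^n  ->  exp(-t)  as n -> infinity.
n = 1_000_000
for t in (0.5, 1.0, 2.0):
    finite_n = (1 - t / n) ** n
    limit = math.exp(-t)
    print(t, finite_n, limit)  # the two columns agree to ~6 decimals
```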
Let X1, …, Xn be a sample of size n from a distribution with expectation μ and variance σ², and let μ̂ = (2X1 + X2 + … + Xn−1 + 2Xn)/(n + 1) be an estimator for μ. Is μ̂ consistent? Asymptotically unbiased? Unbiased?
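One way to organize the algebra: μ̂ is a weighted mean with weights (2, 1, …, 1, 2)/(n + 1), so E[μ̂] = (sum of weights)·μ and Var(μ̂) = (sum of squared weights)·σ². A small sketch tabulating both sums for a few n to suggest the answers before you prove them:

```python
# mu_hat is a weighted mean with weights w = (2, 1, ..., 1, 2) / (n + 1).
# E[mu_hat] = (sum of weights) * mu, and Var(mu_hat) = (sum of squared
# weights) * sigma^2, so these two sums settle all three questions.
for n in (5, 100, 10_000):
    w = [2] + [1] * (n - 2) + [2]
    weight_sum = sum(w) / (n + 1)                   # (n+2)/(n+1): > 1, -> 1
    sq_sum = sum(x * x for x in w) / (n + 1) ** 2   # (n+6)/(n+1)^2: -> 0
    print(n, weight_sum, sq_sum)
```

The weight sum exceeds 1 for every finite n but tends to 1, and the squared-weight sum tends to 0, which points to: biased, asymptotically unbiased, and consistent.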