Problem 5. Let X1, X2, ..., Xn be a random sample from Bernoulli(p), 0 < p...
1. Let X1, X2, ..., Xn be a random sample of size n from a Bernoulli distribution for which p is the probability of success. We know the maximum likelihood estimator for p is p̂ = (1/n) Σ_{i=1}^n X_i. Show that p̂ is an unbiased estimator of p.
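The unbiasedness claim follows from linearity of expectation; a short sketch:

```latex
\mathbb{E}[\hat p]
= \mathbb{E}\!\left[\frac{1}{n}\sum_{i=1}^{n} X_i\right]
= \frac{1}{n}\sum_{i=1}^{n}\mathbb{E}[X_i]
= \frac{1}{n}\,(np) = p .
```

Since E[p̂] = p for every p in (0, 1), p̂ is unbiased.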
Let {X1, X2, ..., Xn} be a sample from Bernoulli(p). Find an unbiased estimator for p^2.
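One standard route (an outline, not the only one): write Y = Σ X_i, which is Binomial(n, p), and use its factorial moment E[Y(Y − 1)] = n(n − 1)p²:

```latex
\mathbb{E}\!\left[\frac{Y(Y-1)}{n(n-1)}\right]
= \frac{n(n-1)\,p^{2}}{n(n-1)} = p^{2},
\qquad Y=\sum_{i=1}^{n}X_i \sim \mathrm{Binomial}(n,p),
```

so Y(Y − 1)/(n(n − 1)) is unbiased for p² (for n ≥ 2).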
Problem 2. Let X1, X2, ..., Xn be a random sample from Bernoulli(p) and consider estimators p̂1 and p̂2.
i. Compute the mean square error MSE_p for both estimators p̂1 and p̂2. Note that you must show the details of the calculation to receive full credit.
ii. Use R to plot MSE_p for both estimators using sample sizes n = 20 and n = 300. Comment on the plots.
iii. Use R to simulate 10,000 different Bernoulli samples of n = 300 with success...
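The assignment asks for R, and the definitions of p̂1 and p̂2 are illegible in the source. As a stand-in, here is a minimal Python sketch of the Monte Carlo MSE comparison, taking the MLE X̄ for p̂1 and an illustrative shrinkage estimator (Σ Xi + 2)/(n + 4) for p̂2; both the language and the second estimator are assumptions, not the original assignment.

```python
# Monte Carlo MSE comparison for two Bernoulli(p) estimators.
# NOTE: the assignment asks for R and defines its own p_1, p_2 (illegible in
# the source); the shrinkage estimator below is a hypothetical stand-in.
import numpy as np

rng = np.random.default_rng(0)

def simulated_mse(estimator, p, n, reps=10_000):
    """Monte Carlo mean squared error of `estimator` at true success prob p."""
    samples = rng.binomial(1, p, size=(reps, n))  # reps samples, each of size n
    return float(np.mean((estimator(samples) - p) ** 2))

def mle(s):
    """MLE of p: the sample mean of each replicated sample (rows of s)."""
    return s.mean(axis=1)

def shrink(s):
    """Illustrative shrinkage estimator (hypothetical): (sum + 2) / (n + 4)."""
    return (s.sum(axis=1) + 2) / (s.shape[1] + 4)

if __name__ == "__main__":
    for n in (20, 300):
        for p in (0.1, 0.5, 0.9):
            print(f"n={n:3d} p={p:.1f}  "
                  f"MSE(mle)={simulated_mse(mle, p, n):.5f}  "
                  f"MSE(shrink)={simulated_mse(shrink, p, n):.5f}")
```

For part ii one would evaluate `simulated_mse` over a grid of p values and plot the two curves (matplotlib here, `plot`/`curve` in R); the shrinkage-type estimator typically wins near p = 0.5 and loses near the boundaries, with the gap narrowing as n grows.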
7. Let X1, ..., Xn be a random sample from a Bernoulli distribution with parameter p. A random variable X with a Bernoulli distribution has a probability mass function (pmf) with E(X) = p and Var(X) = p(1 − p).
(a) Find the method of moments (MOM) estimator of p.
(b) Find a sufficient statistic for p. (Hint: Be careful when you write the joint pmf. Don't forget to sum the whole power of each term; that is, for the second term you will have (1...
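A sketch of both parts under the standard Bernoulli pmf p^x (1 − p)^{1−x}, x ∈ {0, 1}:

```latex
\text{(a) MOM: set } \bar X = \mathbb{E}[X] = p \;\Rightarrow\; \hat p_{\mathrm{MOM}} = \bar X.
\qquad
\text{(b) } f(x_1,\dots,x_n \mid p)
= p^{\sum_i x_i}\,(1-p)^{\,n-\sum_i x_i},
```

so by the factorization theorem T = Σ X_i is sufficient for p; the hint about summing the whole power refers to collecting the exponent n − Σ x_i on the (1 − p) term.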
Let X1, X2, ..., Xn be a random sample from Binomial(1, p) (i.e., n Bernoulli trials). Thus Y = Σ_{i=1}^n X_i is Binomial(n, p).
a. Show that X̄ = (1/n) Σ_{i=1}^n X_i is an unbiased estimator of p.
b. Show that Var(X̄) = p(1 − p)/n.
c. Show that E[X̄(1 − X̄)] = ((n − 1)/n) p(1 − p).
d. Find the value of c so that cX̄(1 − X̄) is an unbiased estimator of Var(X̄) = p(1 − p)/n.
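For parts (c) and (d), a sketch consistent with (a) and (b):

```latex
\mathbb{E}[\bar X(1-\bar X)]
= \mathbb{E}[\bar X] - \mathbb{E}[\bar X^{2}]
= p - \Bigl(\tfrac{p(1-p)}{n} + p^{2}\Bigr)
= \frac{n-1}{n}\,p(1-p),
```

so requiring c · ((n − 1)/n) p(1 − p) = p(1 − p)/n gives c = 1/(n − 1).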
Advanced Statistics, I need help with (c) and (d).
2. Let X1, X2, ..., Xn be a random sample from a Bernoulli(θ) distribution with probability function. Note that, for a random variable X with a Bernoulli(θ) distribution, E[X] = θ and Var[X] = θ(1 − θ).
(a) Obtain the log-likelihood function ℓ(θ), and hence show that the maximum likelihood estimator of θ is θ̂ = (1/n) Σ_{i=1}^n X_i.
(b) Show that ...
(c) Calculate the expected information I(θ).
(d) Show...
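A sketch of (a) and (c) for the Bernoulli(θ) likelihood:

```latex
\ell(\theta) = \Bigl(\sum_i x_i\Bigr)\log\theta + \Bigl(n-\sum_i x_i\Bigr)\log(1-\theta),
\qquad \ell'(\theta)=0 \;\Rightarrow\; \hat\theta = \bar x,
```
```latex
I(\theta) = \mathbb{E}\bigl[-\ell''(\theta)\bigr]
= \frac{n\theta}{\theta^{2}} + \frac{n(1-\theta)}{(1-\theta)^{2}}
= \frac{n}{\theta(1-\theta)}.
```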
Q2. Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (1/n) Σ X_i is an unbiased estimator for p.
1. Find E(p̂²) and show that p̂² is a biased estimator for p². (Hint: make use of the distribution of Σ X_i, and the fact that Var(Y) = E(Y²) − E(Y)².)
2. Suggest an unbiased estimator for p². (Hint: use the fact that the sample variance is unbiased for the variance.) Xi+2...
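A sketch of both parts:

```latex
\mathbb{E}[\hat p^{2}]
= \operatorname{Var}(\hat p) + \bigl(\mathbb{E}[\hat p]\bigr)^{2}
= \frac{p(1-p)}{n} + p^{2} \;\neq\; p^{2},
```

so p̂² overshoots p² by p(1 − p)/n. Since the sample variance S² = (1/(n − 1)) Σ (X_i − X̄)² satisfies E[S²] = p(1 − p), the estimator p̂² − S²/n has expectation p² and is therefore unbiased.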
1. Let X1, X2, ..., Xn be a random sample drawn from some population with mean μ = 2λ and variance σ² = 4, where λ is a parameter. Define V_n = (1/(2n)) Σ_{i=1}^n X_i. We use V_n to estimate λ.
(a) Show that V_n is an unbiased estimator for λ.
(b) Let σ²_{V_n} be the variance of V_n. Show that lim_{n→∞} σ²_{V_n} = 0.
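Assuming the garbled definition is V_n = (1/(2n)) Σ X_i (consistent with the mean 2λ and the 2n visible in the source), a sketch:

```latex
\mathbb{E}[V_n] = \frac{n\mu}{2n} = \frac{2\lambda}{2} = \lambda,
\qquad
\sigma^{2}_{V_n} = \frac{n\sigma^{2}}{4n^{2}} = \frac{4}{4n} = \frac{1}{n}
\;\xrightarrow[n\to\infty]{}\; 0.
```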
Let X1, X2, ..., Xn be a random sample from Poisson(θ), θ > 0.
(b) Let Y = Σ_{i=1}^n X_i. Determine the value of a constant c such that the estimator e^{cY} is an unbiased estimator of e^{−θ}.
(c) Get the lower bound for the variance of the unbiased estimator found in (b).
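A sketch using the mgf of Y ~ Poisson(nθ), assuming the garbled target is e^{−θ}:

```latex
\mathbb{E}\bigl[e^{cY}\bigr] = \exp\{n\theta(e^{c}-1)\} = e^{-\theta}
\;\Rightarrow\; e^{c} = 1-\tfrac{1}{n}
\;\Rightarrow\; c = \log\!\Bigl(1-\frac{1}{n}\Bigr),
```

and the Cramér–Rao lower bound for estimating g(θ) = e^{−θ} is g′(θ)²/I_n(θ) = θe^{−2θ}/n, since I_n(θ) = n/θ for a Poisson sample of size n.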
Suppose that X1, X2, ..., Xn is an iid sample from ... where θ > 0 is unknown.
(a) Find the uniformly minimum variance unbiased estimator (UMVUE) of ...
(b) Find the uniformly most powerful (UMP) test of ... versus ... where θ0 is known.
(c) Derive an expression for the power function of the test in part (b).