Let X1, X2, ..., Xn be a sample from Bernoulli(p). Find an unbiased estimator for p^2.
Solution:
Write T = X1 + X2 + ... + Xn, so that T ~ Binomial(n, p), and let X̄ = T/n.

The natural guess X̄^2 is biased, because
E[X̄^2] = Var(X̄) + (E[X̄])^2 = p(1-p)/n + p^2.

Instead, use the second factorial moment of the binomial:
E[T(T-1)] = E[T^2] - E[T] = (np(1-p) + n^2 p^2) - np = n(n-1)p^2.

Therefore
T(T-1) / (n(n-1))
is an unbiased estimator of p^2.
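With T = Σ Xi, the estimator T(T-1)/(n(n-1)) is unbiased for p^2, since E[T(T-1)] = n(n-1)p^2, while X̄^2 overshoots by p(1-p)/n. A quick Monte Carlo sketch in Python (illustrative only; the function and variable names are my own):

```python
import random

def unbiased_p_squared(sample):
    # T(T-1)/(n(n-1)), with T the number of successes: unbiased for p^2
    n = len(sample)
    t = sum(sample)
    return t * (t - 1) / (n * (n - 1))

random.seed(0)
p, n, reps = 0.3, 10, 200_000
naive_sum = 0.0      # running sum of Xbar^2 (biased upward by p(1-p)/n)
unbiased_sum = 0.0   # running sum of T(T-1)/(n(n-1))
for _ in range(reps):
    sample = [1 if random.random() < p else 0 for _ in range(n)]
    naive_sum += (sum(sample) / n) ** 2
    unbiased_sum += unbiased_p_squared(sample)
naive = naive_sum / reps
unbiased = unbiased_sum / reps
print(f"p^2 = {p*p:.4f}  mean(Xbar^2) = {naive:.4f}  mean(T(T-1)/(n(n-1))) = {unbiased:.4f}")
```

With p = 0.3 and n = 10, the averaged X̄^2 should sit about p(1-p)/n = 0.021 above p^2 = 0.09, while the corrected estimator should average close to 0.09.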
Related questions:

1. Let X1, X2, ..., Xn be a random sample of size n from a Bernoulli distribution for which p is the probability of success. We know the maximum likelihood estimator for p is p̂ = (1/n) Σ_i Xi. Show that p̂ is an unbiased estimator of p.
Let X1, X2, ..., Xn be a random sample from Binomial(1, p) (i.e. n Bernoulli trials), so that Y = Σ_{i=1}^{n} Xi is Binomial(n, p).
a. Show that X̄ = Y/n is an unbiased estimator of p.
b. Show that Var(X̄) = p(1-p)/n.
c. Show that E[X̄(1-X̄)/n] = ((n-1)/n) · p(1-p)/n.
d. Find the value of c so that cX̄(1-X̄) is an unbiased estimator of Var(X̄) = p(1-p)/n.
Problem 5. Let X1, X2, ..., Xn be a random sample from Bernoulli(p), 0 < p < 1, and let p̂n = (1/n) Σ Xi be the sample proportion.
i. Prove that p̂n is an unbiased estimator of p.
ii. Derive an expression for the variance of p̂n.
iii. Prove that p̂n is a consistent estimator of p.
iv. Prove that p̂n(1 - p̂n) ...
Let X1, X2, ..., Xn be an i.i.d. sample from Bernoulli(p) and let Yn = ... . Show that Yn converges to a degenerate distribution at 0 as n → ∞.
Advanced Statistics, I need help with (c) and (d)
2. Let X1, X2, ..., Xn be a random sample from a Bernoulli(θ) distribution with probability function f(x; θ) = θ^x (1-θ)^(1-x), x = 0, 1. Note that, for a random variable X with a Bernoulli(θ) distribution, E[X] = θ and Var[X] = θ(1-θ).
(a) Obtain the log-likelihood function ℓ(θ), and hence show that the maximum likelihood estimator of θ is θ̂ = (1/n) Σ_{i=1}^{n} Xi.
(b) Show that dℓ(θ)/dθ ...
(c) Calculate the expected information I(θ) ...
(d) Show ...
Q2. Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (1/n) Σ Xi is an unbiased estimator for p.
1. Find E(p̂^2) and show that p̂^2 is a biased estimator for p^2. (Hint: make use of the distribution of Σ Xi, and the fact that Var(Y) = E(Y^2) - [E(Y)]^2.)
2. Suggest an unbiased estimator for p^2. (Hint: use the fact that the sample variance is unbiased for the variance.)
3. Show that p̃ = (Σ Xi + 2)/(n + 4) is a biased estimator for p.
4. For what values of p is MSE(p̃) smaller than MSE(p̂)?
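The sample-variance hint in part 2 leads to X̄^2 - S^2/n: since E[S^2] = p(1-p), we get E[X̄^2 - S^2/n] = (p^2 + p(1-p)/n) - p(1-p)/n = p^2. For 0/1 data this is algebraically identical to T(T-1)/(n(n-1)). A small Python sketch (my own illustration, using exact rational arithmetic) checking the identity on a concrete sample:

```python
from fractions import Fraction

def via_sample_variance(sample):
    # Xbar^2 - S^2/n, with S^2 the usual unbiased sample variance
    n = len(sample)
    xbar = Fraction(sum(sample), n)
    s2 = sum((Fraction(x) - xbar) ** 2 for x in sample) / (n - 1)
    return xbar ** 2 - s2 / n

def via_factorial_moment(sample):
    # T(T-1)/(n(n-1)), with T = number of successes
    n, t = len(sample), sum(sample)
    return Fraction(t * (t - 1), n * (n - 1))

sample = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
print(via_sample_variance(sample), via_factorial_moment(sample))  # both 1/3
```

Because the identity X̄^2 - S^2/n = T(T-1)/(n(n-1)) holds term by term for 0/1 data, the two estimators agree on every sample, not just on average.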
Let X1, X2, ..., Xn be a random sample of size n from a population that can be modeled by the following probability model:
f_X(x) = a x^(a-1) / θ^a,  0 < x < θ,  a > 0.
a) Find the probability density function of X(n) = max(X1, X2, ..., Xn).
b) Is X(n) an unbiased estimator for θ? If not, suggest a function of X(n) that is an unbiased estimator for θ.
7. Let X1, ..., Xn be a random sample from a Bernoulli distribution with parameter p. A random variable X with a Bernoulli distribution has probability mass function (pmf) f(x; p) = p^x (1-p)^(1-x), x = 0, 1, with E(X) = p and Var(X) = p(1-p).
(a) Find the method of moments (MOM) estimator of p.
(b) Find a sufficient statistic for p. (Hint: Be careful when you write the joint pmf. Don't forget to sum the whole power of each term; that is, for the second term you will have (1...
2. Let X1, X2, ..., Xn be a random sample from a uniform distribution on the interval (θ-1, θ+1).
- Find the method of moments estimator of θ. Is your estimator an unbiased estimator of θ?
- Given the following n = 5 observations of X, give a point estimate of θ: 6.61, 7.70, 6.98, 8.36, 7.26.