1. The likelihood function is
\[ L(\theta) = \prod_{i=1}^{n} \theta^{x_i}(1-\theta)^{1-x_i} = \theta^{\sum_i x_i}(1-\theta)^{\,n-\sum_i x_i}. \]
The log-likelihood function is
\[ \ell(\theta) = \Big(\sum_{i=1}^{n} x_i\Big)\log\theta + \Big(n-\sum_{i=1}^{n} x_i\Big)\log(1-\theta). \]
The maximum likelihood estimate solves the score equation
\[ \frac{d\ell}{d\theta} = \frac{\sum_i x_i}{\theta} - \frac{n-\sum_i x_i}{1-\theta} = 0. \]
Thus, the maximum likelihood estimate of \(\theta\) is
\[ \hat{\theta} = \frac{1}{n}\sum_{i=1}^{n} X_i = \bar{X}. \]
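The closed-form result can be checked numerically. A minimal sketch (assuming the standard Bernoulli model with a simulated sample; the sample size and seed below are illustrative choices, not from the original): maximize the log-likelihood over a fine grid and confirm the maximizer agrees with the sample mean.

```python
import math
import random

random.seed(0)
theta_true = 0.3
n = 2000
x = [1 if random.random() < theta_true else 0 for _ in range(n)]  # iid Bernoulli(theta) draws

def log_likelihood(theta):
    # l(theta) = (sum x_i) * log(theta) + (n - sum x_i) * log(1 - theta)
    s = sum(x)
    return s * math.log(theta) + (n - s) * math.log(1 - theta)

# Maximize l(theta) over a fine grid and compare with the closed-form MLE.
grid = [k / 1000 for k in range(1, 1000)]
theta_grid = max(grid, key=log_likelihood)
theta_mle = sum(x) / n  # closed-form MLE: the sample mean

assert abs(theta_mle - theta_grid) <= 0.0015  # agree up to grid spacing
```

Because the log-likelihood is strictly concave in \(\theta\), the grid maximizer lands next to the sample mean, so the two estimates differ by at most the grid spacing.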
2. By linearity of expectation, and because the mean of a Bernoulli(\(\theta\)) random variable is \(\theta\),
\[ E[\hat{\theta}] = E\Big[\frac{1}{n}\sum_{i=1}^{n} X_i\Big] = \frac{1}{n}\sum_{i=1}^{n} E[X_i] = \frac{1}{n}\cdot n\theta = \theta. \]
Since \(E[\hat{\theta}] = \theta\), \(\hat{\theta}\) is an unbiased estimator of \(\theta\).
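Unbiasedness can also be illustrated by Monte Carlo: averaging \(\hat{\theta}\) over many independent replications should recover \(\theta\). A minimal sketch (the parameter value, sample size, and replication count below are illustrative assumptions):

```python
import random

random.seed(1)
theta = 0.4
n, reps = 50, 20000

# Average the estimator over many replications; by linearity of expectation
# E[theta_hat] = theta, so the Monte Carlo average should be close to theta.
total = 0.0
for _ in range(reps):
    sample = [1 if random.random() < theta else 0 for _ in range(n)]
    total += sum(sample) / n  # theta_hat = sample mean of this replication
mc_mean = total / reps

assert abs(mc_mean - theta) < 0.005  # within Monte Carlo error of theta
```

The Monte Carlo standard error here is roughly \(\sqrt{\theta(1-\theta)/n}/\sqrt{\text{reps}} \approx 0.0005\), so the tolerance of 0.005 is comfortably wide.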
3. Since \(E[T] \neq \theta\), T is not an unbiased estimator of \(\theta\).
Let \(T'\) be the bias-corrected version of \(T\). Then
\[ E[T'] = \theta. \]
Thus, \(T'\) is an unbiased estimator of \(\theta\).
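The bias-correction step can be made concrete with a hypothetical choice of \(T\) (the original definition of \(T\) was lost in extraction, so the statistic below is purely illustrative): take \(T = \sum_i X_i\), which has \(E[T] = n\theta \neq \theta\), and correct it to \(T' = T/n\), which has \(E[T'] = \theta\).

```python
import random

random.seed(2)
theta = 0.6
n, reps = 10, 50000

# Hypothetical illustration (not the T from the original problem):
# T = sum(X_i) has E[T] = n * theta != theta, so T is biased for theta;
# the corrected statistic T' = T / n has E[T'] = theta.
sum_T = 0.0
sum_Tp = 0.0
for _ in range(reps):
    T = sum(1 if random.random() < theta else 0 for _ in range(n))
    sum_T += T
    sum_Tp += T / n
mean_T, mean_Tp = sum_T / reps, sum_Tp / reps

assert abs(mean_T - n * theta) < 0.05  # E[T] = n * theta, not theta
assert abs(mean_Tp - theta) < 0.01     # E[T'] = theta: unbiased
```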