1) [6 pts] Let Y be a Bernoulli random variable with success probability Pr(Y = 1) ...
Q2. Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (1/n) Σ Xi is an unbiased estimator for p.
1. Find E(p̂²) and show that p̂² is a biased estimator for p². (Hint: make use of the distribution of Σ Xi and the fact that Var(Y) = E(Y²) − [E(Y)]².)
2. Suggest an unbiased estimator for p². (Hint: use the fact that the sample variance is unbiased for the variance.)
3. Show that p̃ = (Σ Xi + 2)/ ...
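A quick numerical sanity check for part 1 (a sketch, not the required algebra): since Var(p̂) = p(1−p)/n, we have E(p̂²) = p² + p(1−p)/n, so p̂² overestimates p² on average. The simulation below uses illustrative values n = 20 and p = 0.3 (my choices, not from the problem):

```python
import random

random.seed(0)
n, p, reps = 20, 0.3, 100_000

# Monte Carlo estimate of E(p_hat^2), where p_hat = (1/n) * sum(X_i)
total = 0.0
for _ in range(reps):
    p_hat = sum(random.random() < p for _ in range(n)) / n
    total += p_hat ** 2
mc = total / reps

# Analytic value: E(p_hat^2) = E(p_hat)^2 + Var(p_hat) = p^2 + p(1-p)/n
analytic = p ** 2 + p * (1 - p) / n
print(round(mc, 4), round(analytic, 4), round(p ** 2, 4))
```

The gap between `analytic` and `p**2` is exactly the bias p(1−p)/n, which vanishes as n grows.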
1. Let X1, X2, ..., Xn be a random sample of size n from a Bernoulli distribution for which p is the probability of success. We know the maximum likelihood estimator for p is p̂ = (1/n) Σᵢ Xᵢ. Show that p̂ is an unbiased estimator of p.
Question 3: A random variable X has a Bernoulli distribution with parameter θ ∈ (0, 1) if X ∈ {0, 1} and P(X = 1) = θ. Suppose that we have n i.i.d. random variables Y1, ..., Yn following a Bernoulli(θ) distribution, with observed values y1, ..., yn.
a) Show that E[X] = θ and Var[X] = θ(1 − θ).
b) Let θ̂ = ȳ = (y1 + ... + yn)/n. Show that θ̂ is unbiased for θ and compute its variance.
c) Let θ̃ = (y1 + ... + yn + 1)/(n + 2) (this ...
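For orientation, parts a) and b) follow directly from linearity of expectation (a standard derivation sketch, not the required write-up):

```latex
E[X] = 0\cdot(1-\theta) + 1\cdot\theta = \theta, \qquad
E[X^2] = \theta \;\Rightarrow\; \operatorname{Var}[X] = \theta - \theta^2 = \theta(1-\theta).
```

```latex
\hat\theta = \bar y = \frac{1}{n}\sum_{i=1}^n Y_i \;\Rightarrow\;
E[\hat\theta] = \frac{1}{n}\sum_{i=1}^n E[Y_i] = \theta, \qquad
\operatorname{Var}[\hat\theta] = \frac{1}{n^2}\sum_{i=1}^n \operatorname{Var}[Y_i]
= \frac{\theta(1-\theta)}{n},
```

using independence of the Yᵢ in the variance step.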
Q2 (continued). Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (1/n) Σ Xi is an unbiased estimator for p.
2. Suggest an unbiased estimator for p². (Hint: use the fact that the sample variance is unbiased for the variance.)
3. Show that p̃ = (Σ Xi + 2)/(n + 4) is a biased estimator for p.
4. For what values of p is MSE(p̃) smaller than MSE(p̂)?
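A sketch of the comparison in part 4, with p̂ = (Σ Xi)/n and p̃ = (Σ Xi + 2)/(n + 4). The key facts: E(p̃) = (np + 2)/(n + 4), so bias(p̃) = (2 − 4p)/(n + 4) and Var(p̃) = np(1 − p)/(n + 4)², while MSE(p̂) = Var(p̂) = p(1 − p)/n. The code below just evaluates these closed forms on a grid (illustrative n = 10; the algebra, not the code, is what the problem asks for):

```python
def mse_phat(p, n):
    # p_hat = sum(X_i)/n is unbiased, so MSE = Var = p(1-p)/n
    return p * (1 - p) / n

def mse_ptilde(p, n):
    # p_tilde = (sum(X_i) + 2)/(n + 4): shrinks the estimate toward 1/2
    bias = (2 - 4 * p) / (n + 4)
    var = n * p * (1 - p) / (n + 4) ** 2
    return bias ** 2 + var

n = 10
grid = [i / 100 for i in range(1, 100)]
better = [p for p in grid if mse_ptilde(p, n) < mse_phat(p, n)]
print(min(better), max(better))  # p_tilde wins on an interval around p = 1/2
```

At p = 1/2 the bias of p̃ vanishes and its variance is strictly smaller, so p̃ always wins there; near the endpoints 0 and 1 the squared bias dominates and p̂ wins.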
7. Let X1, ..., Xn be a random sample from a Bernoulli distribution with parameter p. A random variable X with a Bernoulli distribution has probability mass function (pmf) f(x; p) = p^x (1 − p)^(1−x) for x ∈ {0, 1}, with E(X) = p and Var(X) = p(1 − p).
(a) Find the method of moments (MOM) estimator of p.
(b) Find a sufficient statistic for p. (Hint: Be careful when you write the joint pmf. Don't forget to sum the whole power of each term; that is, for the second term you will have (1 ...
6. Let Y be a continuous random variable with probability density function f(y) = θ y^(θ−1) for 0 < y < k, and f(y) = 0 otherwise, where θ > 1 and k > 0.
(a) Show that k = 1.
(b) Find E(Y) and Var(Y) in terms of θ.
(c) Derive θ̃, the moment estimator of θ, based on a random sample Y1, ..., Yn.
(d) Derive θ̂, the maximum likelihood estimator of θ, based on a random sample Y1, ..., Yn.
(e) A random sample of n = ...
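Parts (c)-(d) have closed forms: E[Y] = θ/(θ + 1) gives the moment estimator θ̃ = Ȳ/(1 − Ȳ), and maximizing ℓ(θ) = n ln θ + (θ − 1) Σ ln yᵢ gives the MLE θ̂ = −n / Σ ln yᵢ. A simulation sketch checking both (illustrative θ = 3, my choice; data drawn via the inverse CDF, since F(y) = y^θ on (0, 1) implies Y = U^(1/θ)):

```python
import math
import random

random.seed(1)
theta, n = 3.0, 50_000

# F(y) = y^theta on (0,1), so Y = U^(1/theta) for U ~ Uniform(0,1)
y = [random.random() ** (1 / theta) for _ in range(n)]

ybar = sum(y) / n
mom = ybar / (1 - ybar)                  # from E[Y] = theta/(theta+1)
mle = -n / sum(math.log(v) for v in y)   # from l'(theta) = n/theta + sum(ln y) = 0

print(round(mom, 2), round(mle, 2))      # both should land near theta = 3
```

Both estimators are consistent, so with n this large the two printed values sit close to the true θ.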
Let X1, X2, ..., Xn be a random sample from Binomial(1, p) (i.e., n Bernoulli trials). Thus Y = Σᵢ₌₁ⁿ Xᵢ is Binomial(n, p).
a. Show that X̄ = Y/n is an unbiased estimator of p.
b. Show that Var(X̄) = p(1 − p)/n.
c. Show that E[X̄(1 − X̄)] = ((n − 1)/n) · p(1 − p).
d. Find the value of c so that cX̄(1 − X̄) is an unbiased estimator of Var(X̄) = p(1 − p)/n.
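Parts c and d can be verified exactly, with no simulation, by summing over the Binomial(n, p) pmf. The sketch below (illustrative n = 8, p = 0.3, my choices) confirms E[X̄(1 − X̄)] = ((n − 1)/n)·p(1 − p), from which c = 1/(n − 1) makes cX̄(1 − X̄) unbiased for Var(X̄) = p(1 − p)/n:

```python
from math import comb

n, p = 8, 0.3

def expect(g):
    # Exact E[g(Y/n)] for Y ~ Binomial(n, p), summing over all n+1 outcomes
    return sum(g(y / n) * comb(n, y) * p ** y * (1 - p) ** (n - y)
               for y in range(n + 1))

lhs = expect(lambda xbar: xbar * (1 - xbar))   # E[Xbar(1 - Xbar)]
rhs = (n - 1) / n * p * (1 - p)                # claimed closed form (part c)
c = 1 / (n - 1)                                # candidate constant (part d)
print(abs(lhs - rhs), abs(c * lhs - p * (1 - p) / n))
```

Both printed differences are at floating-point noise level, matching the algebra E[X̄(1 − X̄)] = E[X̄] − E[X̄²] = p − p² − p(1 − p)/n.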
Advanced Statistics — I need help with (c) and (d).
2. Let X1, X2, ..., Xn be a random sample from a Bernoulli(θ) distribution with probability function f(x; θ) = θ^x (1 − θ)^(1−x), x ∈ {0, 1}. Note that, for a random variable X with a Bernoulli(θ) distribution, E[X] = θ and Var[X] = θ(1 − θ).
(a) Obtain the log-likelihood function ℓ(θ), and hence show that the maximum likelihood estimator of θ is θ̂ = (1/n) Σᵢ₌₁ⁿ Xᵢ.
(b) Show that dℓ(θ)/dθ ...
(c) Calculate the expected information I(θ) = E[−d²ℓ(θ)/dθ²].
(d) Show ...
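For part (c), a standard route (a sketch, not the full required solution): writing ℓ(θ) = Σᵢ [xᵢ ln θ + (1 − xᵢ) ln(1 − θ)],

```latex
\frac{d\ell}{d\theta} = \frac{\sum x_i}{\theta} - \frac{n - \sum x_i}{1-\theta},
\qquad
\frac{d^2\ell}{d\theta^2} = -\frac{\sum x_i}{\theta^2} - \frac{n - \sum x_i}{(1-\theta)^2},
```

```latex
I(\theta) = E\!\left[-\frac{d^2\ell}{d\theta^2}\right]
= \frac{n\theta}{\theta^2} + \frac{n(1-\theta)}{(1-\theta)^2}
= \frac{n}{\theta} + \frac{n}{1-\theta}
= \frac{n}{\theta(1-\theta)},
```

using E[Σ xᵢ] = nθ in the expectation step. Note I(θ) is the reciprocal of Var(θ̂) = θ(1 − θ)/n, as expected since the MLE here attains the Cramér-Rao bound.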
Question 3 [25]
Let Y1, Y2, ..., Yn denote a random sample of size n from a population with an exponential distribution whose density is given by f(y) = (1/θ) e^(−y/θ) for y > 0, and 0 otherwise. Let Y(1) = min{Y1, ..., Yn} denote the smallest order statistic.
3.1 Show that θ̂ = nY(1) is an unbiased estimator for θ. [12]
3.2 Find the mean square error MSE(θ̂). [13]
Question 4 [25]
4.1 Distinguish ...
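For 3.1-3.2, the key fact is that Y(1) is itself exponential with mean θ/n, so E[nY(1)] = θ, and since the estimator is unbiased, MSE(nY(1)) = Var(nY(1)) = n²(θ/n)² = θ². A simulation sketch (illustrative θ = 2, n = 5, my choices):

```python
import random

random.seed(2)
theta, n, reps = 2.0, 5, 100_000

# Estimator: n * Y(1), where Y(1) is the min of n Exponential(mean theta) draws.
# expovariate takes the rate lambda = 1/theta, so each draw has mean theta.
ests = [n * min(random.expovariate(1 / theta) for _ in range(n))
        for _ in range(reps)]

mean_est = sum(ests) / reps
mse = sum((e - theta) ** 2 for e in ests) / reps
print(round(mean_est, 2), round(mse, 2))  # mean near theta = 2, MSE near theta^2 = 4
```

Note MSE(θ̂) = θ² does not shrink with n: nY(1) is unbiased but not consistent, which is why the sample mean is the usual estimator of θ.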
difficult…… problems 2 and 4, thanks.
Mathematical Statistics (Homework 5)
1. Let X1, ..., Xn be a random sample from the density ... where 0 ≤ θ ≤ 1. Find an unbiased estimator of θ.
2. Let X1, ..., Xn be independent random variables having pdf f(x; λ) given by f(x | λ) = λ e^(−λx), x > 0. Show that X̄ is a sufficient statistic for λ.
3. Let X1, ..., Xn be a random sample from an exponential distribution with ... (a) Find a sufficient statistic for λ. ...