Q2 Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (1/n) Σ Xi is an unbiased estimator for p. 1. Find E(p̂²) and show that p̂² is a biased estimator for p². (Hint: make use of the distribution of Σ Xi, and the fact that Var(Y) = E(Y²) − [E(Y)]².) 2. Suggest an unbiased estimator for p². (Hint: use the fact that the sample variance is unbiased for the variance.) ...
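A quick Monte Carlo sketch (not part of the original posting) illustrates both parts: E(p̂²) = p² + Var(p̂) = p² + p(1−p)/n, so p̂² overshoots p², while p̂² − S²/n is unbiased because the sample variance S² is unbiased for p(1−p). The values of p and n below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.3, 20, 200_000

x = rng.binomial(1, p, size=(reps, n))   # reps Bernoulli(p) samples of size n
p_hat = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)               # sample variance, unbiased for p(1-p)

print("target p^2            :", p**2)
print("mean of p_hat^2       :", (p_hat**2).mean())          # ≈ p^2 + p(1-p)/n
print("mean of p_hat^2 - S2/n:", (p_hat**2 - s2/n).mean())   # ≈ p^2
```

The gap between the second and first printed values is the bias term p(1−p)/n from the hint.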
Q1 and Q2 (please also show the steps): Q1 Prove that MSE(θ̂) = Var(θ̂) + Bias(θ̂)², i.e., E[(θ̂ − θ)²] = E[(θ̂ − E(θ̂))²] + [E(θ̂) − θ]². Q2 Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (1/n) Σ Xi is an unbiased estimator for p. 1. Find E(p̂²) and show that p̂² is a biased estimator for p². (Hint: make use of the distribution of Σ Xi and the fact ...
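For Q1, the standard add-and-subtract derivation can be sketched as follows (this outline is supplied here, not part of the original posting):

```latex
\begin{aligned}
E\big[(\hat\theta-\theta)^2\big]
 &= E\big[\big(\hat\theta - E(\hat\theta) + E(\hat\theta) - \theta\big)^2\big] \\
 &= E\big[(\hat\theta - E(\hat\theta))^2\big]
    + 2\big(E(\hat\theta)-\theta\big)\,E\big[\hat\theta - E(\hat\theta)\big]
    + \big(E(\hat\theta)-\theta\big)^2 \\
 &= \operatorname{Var}(\hat\theta) + \operatorname{Bias}(\hat\theta)^2 ,
\end{aligned}
```

where the cross term vanishes because E[θ̂ − E(θ̂)] = 0 and the constant (E(θ̂) − θ) factors out of the expectation.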
Q3 Suppose X1, X2, ..., Xn are i.i.d. Poisson random variables with expected value λ. It is well-known that X̄ is an unbiased estimator for λ because λ = E(X). 1. Show that (X1 + Xn)/2 is also an unbiased estimator for λ. 2. Show that S² = Σ(Xi − X̄)²/(n − 1) is also an unbiased estimator for λ. 3. Find MSE(S²). (We will need two facts.) Fact 1: the variance of the sample variance (see .../questions/2476527/variance-of-sample-variance). Fact 2: For the Poisson distribution, E[(X − μ)⁴] = 3λ² + λ. (See ...
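A simulation sketch (my addition, with arbitrary λ and n) checks all three parts: X̄, (X1 + Xn)/2, and S² should all average to λ, and, combining the two facts above, since S² is unbiased the MSE reduces to Var(S²) = (1/n)(μ₄ − (n−3)σ⁴/(n−1)) with μ₄ = 3λ² + λ and σ⁴ = λ², which simplifies to λ/n + 2λ²/(n−1).

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n, reps = 4.0, 15, 300_000

x = rng.poisson(lam, size=(reps, n))
xbar = x.mean(axis=1)
half = (x[:, 0] + x[:, -1]) / 2          # (X1 + Xn)/2
s2 = x.var(axis=1, ddof=1)               # S^2 with the n-1 divisor

print("E[xbar]      ≈", xbar.mean())                 # ≈ lam
print("E[(X1+Xn)/2] ≈", half.mean())                 # ≈ lam
print("E[S^2]       ≈", s2.mean())                   # ≈ lam
print("MSE(S^2)     ≈", ((s2 - lam)**2).mean())
print("theory       =", lam/n + 2*lam**2/(n-1))      # λ/n + 2λ²/(n-1)
```

Note the first estimator uses all n observations while (X1 + Xn)/2 uses only two, so both are unbiased but the latter has much larger variance.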
1. Let X1, X2, ..., Xn be a random sample of size n from a Bernoulli distribution for which p is the probability of success. We know the maximum likelihood estimator for p is p̂ = (1/n) Σᵢ Xᵢ. Show that p̂ is an unbiased estimator of p.
Let X1, X2, ..., Xn be a random sample from Binomial(1, p) (i.e., n Bernoulli trials). Thus Y = Σᵢ₌₁ⁿ Xᵢ is Binomial(n, p). a. Show that X̄ = Y/n is an unbiased estimator of p. b. Show that Var(X̄) = p(1 − p)/n. c. Show that E[X̄(1 − X̄)] = (n − 1)p(1 − p)/n. d. Find the value of c so that cX̄(1 − X̄) is an unbiased estimator of Var(X̄) = p(1 − p)/n.
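Parts (c) and (d) can be sanity-checked numerically (a sketch I am adding, with arbitrary p and n): if E[X̄(1 − X̄)] = (n−1)p(1−p)/n, then c = 1/(n−1) makes cX̄(1 − X̄) unbiased for Var(X̄) = p(1−p)/n.

```python
import numpy as np

rng = np.random.default_rng(2)
p, n, reps = 0.4, 10, 300_000

xbar = rng.binomial(n, p, size=reps) / n          # X̄ = Y/n with Y ~ Binomial(n, p)
c = 1 / (n - 1)                                   # candidate constant from part (d)

print("Var(xbar) target :", p*(1-p)/n)
print("E[xbar(1-xbar)]  :", (xbar*(1-xbar)).mean())      # ≈ (n-1)p(1-p)/n
print("E[c*xbar(1-xbar)]:", (c*xbar*(1-xbar)).mean())    # ≈ p(1-p)/n
```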
Let {x1, x2, ..., xn} be a sample from Bernoulli(p). Find an unbiased estimator for p².
Let X1, X2, ..., Xn be an i.i.d. sample from Bernoulli(p) and let ... . Show that Yn converges to a degenerate distribution at 0 as n → ∞.
Q2 Suppose X1, X2, X3 are independent Bernoulli random variables with p = 0.5. Let Yi be the partial sums, i.e., Y1 = X1, Y2 = X1 + X2, Y3 = X1 + X2 + X3. 1. What is the distribution of each Yi, i = 1, 2, 3? 2. What is the expected value of Y1 + Y2 + Y3? 3. Are Y1 and Y2 independent? Explain by computing their joint P.M.F. 4. What is the variance of Y1 ...
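For part 3, the joint P.M.F. of Y1 and Y2 can be tabulated exactly by enumerating the eight equally likely (X1, X2, X3) outcomes (a sketch I am adding, not the original solution):

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Enumerate the 8 equally likely (X1, X2, X3) outcomes when p = 0.5
joint = Counter()
for x1, x2, x3 in product([0, 1], repeat=3):
    y1, y2 = x1, x1 + x2               # Y1 = X1, Y2 = X1 + X2
    joint[(y1, y2)] += Fraction(1, 8)

for (y1, y2), prob in sorted(joint.items()):
    print(f"P(Y1={y1}, Y2={y2}) = {prob}")

# Independence would require P(Y1=0, Y2=2) = P(Y1=0) P(Y2=2) = 1/2 * 1/4,
# but (Y1=0, Y2=2) is impossible, so Y1 and Y2 are dependent.
print("P(Y1=0, Y2=2) =", joint[(0, 2)])   # 0
```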
Problem 5 Let X1, X2, ..., Xn be a random sample from Bernoulli(p), 0 < p < 1, and let p̂n denote the sample proportion. 7.i. Prove that the sample proportion p̂n is an unbiased estimator of p. 7.ii. Derive an expression for the variance of p̂n. 7.iii. Prove that the sample proportion is a consistent estimator of p. 7.iv. Prove that p̂n(1 − p̂n) ...
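Parts 7.ii and 7.iii fit together: Var(p̂n) = p(1−p)/n shrinks to 0 as n grows, so by Chebyshev's inequality p̂n converges in probability to p. A small simulation (my addition, with an arbitrary p) shows the variance tracking the formula and vanishing:

```python
import numpy as np

rng = np.random.default_rng(3)
p, reps = 0.7, 100_000

# Var(p_hat_n) = p(1-p)/n -> 0, so p_hat_n concentrates around p (consistency)
for n in (10, 100, 1000):
    p_hat = rng.binomial(n, p, size=reps) / n
    print(f"n={n:5d}  mean={p_hat.mean():.4f}  "
          f"var={p_hat.var():.6f}  theory={p*(1-p)/n:.6f}")
```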
4. Suppose that X1, X2, ..., Xn are i.i.d. random variables with density function f(x) = ..., 0 < x < 1, θ > 0. a) Find a sufficient statistic for θ. Is the statistic minimal sufficient? b) Find the MLE for θ and verify that it is a function of the statistic in a). c) Find I_X(θ) and hence give the CRLB for an unbiased estimator of θ. (Note: pdf means probability density function.)