Question


Suppose X1, X2, . . . , Xn (n ≥ 5) are i.i.d. Exp(µ) with density f(x) = (1/µ) e^(−x/µ) for x > 0.

(a) Let µ̂1 = X̄. Show that µ̂1 is a minimum variance unbiased estimator of µ.
(b) Let µ̂2 = (X1 + X2)/2. Show that µ̂2 is unbiased. Calculate Var(µ̂2). Confirm that Var(µ̂1) < Var(µ̂2), and calculate the efficiency of µ̂2 relative to µ̂1.
(c) Show that X̄ is consistent and sufficient.
(d) Show that µ̂2 is neither consistent nor sufficient. (Hint: (X1 + X2)/2 ∼ Γ(2, µ/2).)
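A quick Monte Carlo check (a sketch in Python with NumPy; the variable names mirror the estimators in the problem, and the specific µ and n values are illustrative) makes part (b) concrete: since Var(µ̂1) = µ²/n and Var(µ̂2) = µ²/2, the efficiency of µ̂2 relative to µ̂1 is Var(µ̂1)/Var(µ̂2) = 2/n.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, n, reps = 3.0, 10, 200_000  # illustrative values; n >= 5 as required

# Each row is one sample of size n from Exp(mu) (scale parameterization,
# so E[X] = mu and Var(X) = mu^2).
samples = rng.exponential(scale=mu, size=(reps, n))

mu_hat1 = samples.mean(axis=1)         # X-bar: mean of all n observations
mu_hat2 = samples[:, :2].mean(axis=1)  # (X1 + X2) / 2: uses only the first two

# Both estimators are unbiased (means approximately mu = 3), but their
# variances differ: mu^2/n = 0.9 versus mu^2/2 = 4.5.
print(mu_hat1.mean(), mu_hat1.var())
print(mu_hat2.mean(), mu_hat2.var())
print(mu_hat1.var() / mu_hat2.var())   # relative efficiency, approx 2/n = 0.2
```

The simulation also hints at part (d): as n grows, Var(µ̂1) → 0 so µ̂1 concentrates at µ, while µ̂2 ignores all but two observations and its distribution (Γ(2, µ/2)) never tightens, so it cannot be consistent.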
