Question

Suppose that you have collected n observations on yᵢ, and that yᵢ = μ + εᵢ. You can assume that E(εᵢ) = 0 for all i, E(εᵢ²) = σ² for all i, and E(εᵢεⱼ) = 0 for all i ≠ j. You want to estimate a sample mean, and your friend tells you to use the following estimator:

μ̂ = (1/n) Σᵢ wᵢyᵢ,

where wᵢ is a known sample weight for observation i (this means wᵢ is non-random).

(a) Find E(μ̂).
(b) Under what conditions, if any, is μ̂ an unbiased estimator? Under what conditions, if any, is μ̂ a biased estimator?
(c) Find var(μ̂).
(d) Suppose that wᵢ = 1 for all i. Find var(μ̂).

Answer #1

Ans. We assume that each εᵢ has a distribution with mean zero and variance σ². Then yᵢ = μ + εᵢ has mean μ and variance σ².

(a) Since the weights wᵢ are non-random,
E(μ̂) = E[(1/n) Σᵢ wᵢyᵢ] = (1/n) Σᵢ wᵢ E(yᵢ) = (μ/n) Σᵢ wᵢ.

(b) μ̂ is unbiased exactly when (1/n) Σᵢ wᵢ = 1, i.e. when the weights sum to n (for example, wᵢ = 1 for all i). If Σᵢ wᵢ ≠ n, then E(μ̂) ≠ μ and μ̂ is biased.

(c) Since E(εᵢεⱼ) = 0 for i ≠ j, the yᵢ are uncorrelated, so
var(μ̂) = (1/n²) Σᵢ wᵢ² var(yᵢ) = (σ²/n²) Σᵢ wᵢ².

(d) Suppose that wᵢ = 1 for all i. Then μ̂ reduces to the ordinary sample mean and
var(μ̂) = (σ²/n²) · n = σ²/n.
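A quick Monte Carlo check of parts (a) and (c) — a sketch in Python/NumPy; the values of mu, sigma, n, and the weight vector are illustrative assumptions, not values given in the question:

```python
import numpy as np

# Monte Carlo sketch of parts (a) and (c). mu, sigma, n, and the weight
# vector below are illustrative assumptions, not values from the question.
rng = np.random.default_rng(0)
n = 4
mu, sigma = 5.0, 2.0
w = np.array([0.5, 1.5, 1.0, 1.0])  # known, non-random weights; sum(w) = n, so mu_hat is unbiased here

reps = 200_000
eps = rng.normal(0.0, sigma, size=(reps, n))  # E[eps_i] = 0, E[eps_i^2] = sigma^2, independent across i
y = mu + eps                                  # y_i = mu + eps_i
mu_hat = (y * w).sum(axis=1) / n              # mu_hat = (1/n) * sum_i w_i * y_i, one draw per row

emp_mean = mu_hat.mean()
emp_var = mu_hat.var()
theory_mean = mu * w.sum() / n                # part (a): E(mu_hat) = (mu/n) * sum_i w_i
theory_var = sigma**2 * (w**2).sum() / n**2   # part (c): var(mu_hat) = (sigma^2/n^2) * sum_i w_i^2
```

Setting wᵢ = 1 for all i in the same script reproduces part (d), since Σᵢ wᵢ² = n gives var(μ̂) = σ²/n.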

Similar Homework Help Questions
  • 4. Xᵢ, i = 1, …, n are iid N(μ, σ²). (a) Find the MLE of...

    4. Xᵢ, i = 1, …, n are iid N(μ, σ²). (a) Find the MLE of μ and σ². Are these unbiased estimators of μ and of σ², respectively? Aside: You can use your result in (b) to justify your answer for the bias part of the MLE estimator of σ². (b) In this part you will show, despite the fact that the sample variance is an unbiased estimator of σ², that the sample standard deviation is a biased estimator of σ....

  • 4. Suppose Y₁, …, Yₙ are iid random variables with E(Y) = μ, Var(Y) = σ² < ∞. For...

    4. Suppose Y₁, …, Yₙ are iid random variables with E(Y) = μ, Var(Y) = σ² < ∞. For large n, find the approximate distribution of Ȳ. Be sure to name any theorems you used.

  • 5.26 Suppose that y is Nₙ(μ, Σ), where μ = μj. Thus E(yᵢ) = μ for all i, var(yᵢ) = σ² f...

    5.26 Suppose that y is Nₙ(μ, Σ), where μ = μj. Thus E(yᵢ) = μ for all i, var(yᵢ) = σ² for all i, and cov(yᵢ, yⱼ) = σ²ρ for i ≠ j; that is, the y's are equicorrelated. (a) Show that Σ can be written in the form Σ = σ²[(1 − ρ)I + ρJ]. (b) Show that Σᵢ(yᵢ − ȳ)²/[σ²(1 − ρ)] is χ²(n − 1).

  • 3. Consider the linear model: Yᵢ = α + βxᵢ + εᵢ for i = 1, …, n, where E(εᵢ) = 0. Further...

    3. Consider the linear model: Yᵢ = α + βxᵢ + εᵢ for i = 1, …, n, where E(εᵢ) = 0. Further assume that Σᵢ xᵢ = 0 and Σᵢ xᵢ² = n. (a) Show that the least squares estimates (LSEs) of α and β are given by α̂ = Ȳ and β̂ = (1/n) Σᵢ xᵢYᵢ. (b) Show that the LSEs in (a) are unbiased. (c) Assume that E(εᵢ²) = σ² and E(εᵢεⱼ) = 0 for all i ≠ j, where σ² > 0. Show that V(β̂) = σ²/n, and (d) use (b) and (c) above to show that the LSEs are consistent...

  • 4. Suppose Y₁, …, Yₙ are iid random variables with E(Yᵢ) = μ, Var(Yᵢ) = σ² < ∞. For...

    4. Suppose Y₁, …, Yₙ are iid random variables with E(Yᵢ) = μ, Var(Yᵢ) = σ² < ∞. For large n, find the approximate distribution of μ̂ = (1/n) Σᵢ₌₁ⁿ Yᵢ. Be sure to name any theorems you used.

  • Consider a random vector Y = (y(1), y(2), …, y(k)) where the elements are y(j) = x + w(j), j = 1, ...

    Consider a random vector Y = (y(1), y(2), …, y(k)) where the elements are y(j) = x + w(j), j = 1, …, k, and the w(j) are independent, identically distributed, Gaussian, zero-mean, with variance σ², i.e., N(0, σ²). 1. Find the Maximum Likelihood (ML) estimator for x, i.e., x̂_ML. 2. Find the Mean Square Error (MSE) of the ML estimator, i.e., MSE(x̂_ML) = Var(x̂_ML). 3. Is this estimator consistent? Prove your answer. 4. Is this estimator efficient? Prove your answer.

  • Problem 2 (20 points) Assume X₁, …, Xₙ is a random sample from the normal distribution N(μ, σ²). Show that S² is...

    Problem 2 (20 points) Assume X₁, …, Xₙ is a random sample from the normal distribution N(μ, σ²). Show that S² = (n − 1)⁻¹ Σᵢ₌₁ⁿ (Xᵢ − X̄)² is an unbiased estimator of σ².

  • 1. You wish to estimate μ, the expected value of Y. You have three data points,...

    1. You wish to estimate μ, the expected value of Y. You have three data points, Y₁ ~ N(μ, 1), Y₂ ~ N(μ, 2), and Y₃ ~ N(μ, 3). (a) You estimate μ using the sample mean of your three random variables. Find the expected value of your sample mean. (b) True or False (and state why): the sample mean with just three observations gives a biased estimate of μ, since it does not take into account any of the information...

  • R1. Suppose X is a continuous RV with E(X) = μ and Var(X) = σ², where both μ and σ...

    R1. Suppose X is a continuous RV with E(X) = μ and Var(X) = σ², where both μ and σ are unknown. Note that X may not have a normal distribution. Show that X̄² is an asymptotically unbiased estimator for μ². (This problem does not require the computer.) R2. Let X ~ N(μ = 10, σ² = 0.82). Following up on R1, we will be approximating μ², which we can see should be 100. For now, let the sample size be n = 3. Pick 3 random numbers from...

