Let $Y_1, Y_2, \ldots, Y_n \overset{\text{iid}}{\sim} N(\mu, \sigma^2)$, where the population mean $\mu$ and population variance $\sigma^2$ are both unknown. Show that the Method of Moments (MOM) estimators of $\mu$ and $\sigma^2$ are given by
$$\hat{\mu} = \bar{Y} = \frac{1}{n}\sum_{i=1}^{n} Y_i, \qquad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} (Y_i - \bar{Y})^2 = \frac{n-1}{n}\,S^2.$$
Note: In this case, $(\bar{Y}, S^2)$ is a sufficient statistic for $(\mu, \sigma^2)$. The MOM estimators of $\mu$ and $\sigma^2$ are therefore functions of a sufficient statistic.
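A quick numerical sanity check (not a substitute for the algebraic derivation the problem asks for): the MOM variance estimator uses denominator $n$, so it equals $\frac{n-1}{n}S^2$ where $S^2$ is the usual sample variance with denominator $n-1$.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=5.0, scale=2.0, size=1000)
n = len(y)

# MOM matches sample moments to E(Y) = mu and E(Y^2) = sigma^2 + mu^2.
mu_hat = y.mean()
sigma2_hat = np.mean(y**2) - mu_hat**2   # = (1/n) * sum (y_i - ybar)^2

# Relation to the sample variance S^2 (denominator n - 1):
s2 = y.var(ddof=1)
print(np.isclose(sigma2_hat, (n - 1) / n * s2))  # True
```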
Let $Y_1, Y_2, \ldots, Y_n$ be an iid sample from a population distribution described by the pdf
$$f_Y(y \mid \theta) = (\theta + 1)\,y^{\theta}, \quad 0 < y < 1,$$
for $\theta > -1$.
(a) Find the MOM estimator of $\theta$.
(b) Find the maximum likelihood estimator (MLE) of $\theta$.
(c) Find the MLE of the population mean $E(Y) = \dfrac{\theta + 1}{\theta + 2}$. You do not need to prove that this expression is true; just find its MLE.
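A simulation sketch for checking your answers to (a) and (b). It assumes the standard closed forms one obtains for this density, $\hat{\theta}_{\text{MOM}} = \frac{2\bar{Y} - 1}{1 - \bar{Y}}$ (from $E(Y) = \frac{\theta+1}{\theta+2}$) and $\hat{\theta}_{\text{MLE}} = -\frac{n}{\sum_i \log Y_i} - 1$; draws use inverse-CDF sampling since $F(y) = y^{\theta+1}$ on $(0,1)$.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n = 2.0, 200_000
# Inverse-CDF sampling: F(y) = y^(theta+1), so Y = U^(1/(theta+1)).
y = rng.uniform(size=n) ** (1.0 / (theta + 1.0))

theta_mom = (2 * y.mean() - 1) / (1 - y.mean())  # solves ybar = (t+1)/(t+2)
theta_mle = -n / np.log(y).sum() - 1             # maximizes the log-likelihood

print(theta_mom, theta_mle)  # both close to the true theta = 2.0
```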
Let $Y_1, Y_2, \ldots, Y_n$ be iid from a population following the shifted exponential distribution with scale parameter $\beta = 1$. The pdf of the population distribution is given by
$$f_Y(y \mid \theta) = e^{-(y - \theta)}\, I(y > \theta).$$
The "shift" $\theta > 0$ is the only unknown parameter.
(a) Find $L(\theta \mid \mathbf{y})$, the likelihood function of $\theta$.
(b) Find a sufficient statistic for $\theta$ using the Factorization Theorem. (Hint: $\theta$ is bounded above by $Y_{(1)} = \min\{Y_1, Y_2, \ldots, Y_n\}$.)
(c) ...
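The hint points at the standard result that the likelihood $L(\theta \mid \mathbf{y}) = e^{n\theta - \sum_i y_i}\, I(\theta \le y_{(1)})$ is increasing in $\theta$ on $(0, y_{(1)}]$, so it peaks at the sample minimum. A small simulation sketch under that assumption:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n = 1.5, 500
y = theta + rng.exponential(scale=1.0, size=n)  # shifted Exp(1) draws

# Every observation exceeds the shift, so theta <= Y_(1) always holds,
# and Y_(1) sits just above theta for moderate n.
y_min = y.min()
print(theta <= y_min)  # True
print(y_min)           # slightly above 1.5
```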
1. Let $Y_1, Y_2, \ldots, Y_n$ denote a random sample from a population with mean $\mu \in (-\infty, \infty)$ and variance $\sigma^2 \in (0, \infty)$. Let $\bar{Y}_n = n^{-1}\sum_{i=1}^{n} Y_i$. Recall that, by the law of large numbers, $\bar{Y}_n$ is a consistent estimator of $\mu$.
(a) (10 points) Prove that $U_n = \frac{n}{n+1}\,\bar{Y}_n$ is a consistent estimator of $\mu$.
(b) (5 points) Prove that $V_n = \bar{Y}_n - n$ is not a consistent estimator of $\mu$.
(c) (5 points) Suppose that, for each $i$, $P(Y_i \ldots$ ? Prove what...
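A simulation contrasting the behavior in parts (a)-(b): $\bar{Y}_n$ settles near $\mu$ as $n$ grows, while $\bar{Y}_n - n$ runs off to $-\infty$, which is the intuition behind its inconsistency (the proof itself is what the problem asks for).

```python
import numpy as np

rng = np.random.default_rng(4)
mu = 2.0
ybars = {}
for n in (100, 10_000, 1_000_000):
    ybars[n] = rng.normal(mu, 1.0, size=n).mean()
    # Ybar_n approaches mu = 2, but Ybar_n - n diverges with n.
    print(n, ybars[n], ybars[n] - n)
```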
7. (12 points) Let $Y_1, Y_2, \ldots, Y_n$ be a random sample from $\text{Gamma}(\alpha, \beta)$, where $\alpha = 2$ and $\beta$ is an unknown parameter.
(a) Find the method of moments (MOM) estimator of $\beta$.
(b) Find the maximum likelihood estimator (MLE) of $\beta$.
(c) Are the estimators in parts (a) and (b) MVUEs for $\beta$? Justify your answer.
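For checking (a)-(b) numerically: with the shape $\alpha = 2$ known, both the MOM estimator (from $E(Y) = \alpha\beta = 2\beta$) and the MLE reduce to the same closed form $\hat{\beta} = \bar{Y}/2$, which a simulation should land near the true $\beta$.

```python
import numpy as np

rng = np.random.default_rng(3)
alpha, beta = 2.0, 3.0
y = rng.gamma(shape=alpha, scale=beta, size=100_000)

# MOM: set ybar = alpha * beta; MLE: solve dl/dbeta = 0. Both give Ybar/alpha.
beta_hat = y.mean() / alpha
print(beta_hat)  # close to the true beta = 3.0
```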
Suppose $Y_1, Y_2, \ldots, Y_n \mid \tau \overset{\text{iid}}{\sim} N(\mu_0, \tau^{-1})$. The population mean $\mu_0$ is known. The unknown parameter $\tau > 0$, which is the inverse of the population variance, is called the precision. The pdf of $N(\mu_0, \tau^{-1})$ is given by
$$f_{Y \mid \tau}(y \mid \tau) = \sqrt{\frac{\tau}{2\pi}} \exp\left[-\frac{\tau}{2}(y - \mu_0)^2\right].$$
Let's now derive the posterior distribution of $\tau$ from the Bayesian perspective.
(a) Define $U = \sum_{i=1}^{n} (Y_i - \mu_0)^2$. Show that $U$ is a sufficient statistic for $\tau$ using the Factorization...
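Part (a) is a factorization argument; numerically, the point is that the log-likelihood depends on the data only through $U = \sum_i (Y_i - \mu_0)^2$. A sketch verifying this (function names are illustrative, not from the problem):

```python
import numpy as np

rng = np.random.default_rng(5)
mu0, tau = 0.0, 4.0                       # variance = 1/tau = 0.25
y = rng.normal(mu0, np.sqrt(1 / tau), size=50)
n = len(y)

U = np.sum((y - mu0) ** 2)                # the candidate sufficient statistic

def log_lik(t, data):
    """Full log-likelihood from the raw data."""
    return n / 2 * np.log(t / (2 * np.pi)) - t / 2 * np.sum((data - mu0) ** 2)

def log_g(t, u):
    """Same quantity written as a function of (U, tau) alone."""
    return n / 2 * np.log(t / (2 * np.pi)) - t / 2 * u

print(np.isclose(log_lik(3.0, y), log_g(3.0, U)))  # True: h(y) = 1 here
```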
Question 1 (20 points). Suppose that $Y_1, Y_2, \ldots, Y_n$ is an iid sample from a $U(\theta, 1)$ distribution.
(a) Show that $\hat{\theta} = 2\bar{Y} - 1$ is an unbiased estimator of $\theta$.
(b) Show that the standard error of $\hat{\theta}_n$ is $\dfrac{1 - \theta}{\sqrt{3n}}$.
(c) Find an unbiased estimator of ... Prove that your estimator is unbiased.
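Part (a) only makes sense if the sample is drawn from $U(\theta, 1)$, so that $E(Y) = (\theta + 1)/2$ and hence $E(2\bar{Y} - 1) = \theta$. Under that reading, a simulation sketch of the unbiasedness claim:

```python
import numpy as np

rng = np.random.default_rng(6)
theta, n, reps = 0.3, 20, 20_000
# Each row is one sample of size n from U(theta, 1); each estimate is 2*Ybar - 1.
ests = 2 * rng.uniform(theta, 1.0, size=(reps, n)).mean(axis=1) - 1
print(ests.mean())  # close to the true theta = 0.3
```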
Suppose $Y_1, Y_2, \ldots, Y_n$ denote a random sample from a population with the Rayleigh distribution (a Weibull distribution with parameters $2$, $\theta$), with density function
$$f(y \mid \theta) = \frac{2y}{\theta}\, e^{-y^2/\theta}, \quad \theta > 0,\; y > 0.$$
Consider the estimators $\hat{\theta}_1 = Y_{(1)} = \min\{Y_1, Y_2, \ldots, Y_n\}$ and $\hat{\theta}_2 = \frac{1}{n}\sum_{i=1}^{n} Y_i^2$.
ii) (10 points) Determine whether $\hat{\theta}_1$ and $\hat{\theta}_2$ are unbiased estimators, and in...
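A simulation hint for part ii): under this density, $Y^2$ is exponential with mean $\theta$ (so $Y$ can be simulated as the square root of an exponential draw), which makes the behavior of the two estimators easy to see empirically.

```python
import numpy as np

rng = np.random.default_rng(7)
theta, n, reps = 2.0, 10, 20_000
# Y^2 ~ Exponential(mean theta), so simulate Y = sqrt(Exp(theta)).
y = np.sqrt(rng.exponential(scale=theta, size=(reps, n)))

t1 = y.min(axis=1)          # theta_hat_1 = Y_(1)
t2 = (y ** 2).mean(axis=1)  # theta_hat_2 = (1/n) * sum Y_i^2

print(t2.mean())  # near theta = 2.0, consistent with unbiasedness
print(t1.mean())  # far below 2.0, suggesting Y_(1) is biased for theta
```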
Question 4. Let $Y_1, Y_2, \ldots, Y_n$ denote a random sample, and let $E(Y_i) = \mu$ and $\operatorname{Var}(Y_i) = \sigma_Y^2$ for $i = 1, 2, \ldots, n$.
(b) Prove that the standard error of the sample mean $\bar{Y}$ is $SE(\bar{Y}) = \sigma_Y / \sqrt{n}$.
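The identity to be proved, $SE(\bar{Y}) = \sigma_Y/\sqrt{n}$, can be checked empirically by comparing the spread of many simulated sample means against the formula:

```python
import numpy as np

rng = np.random.default_rng(8)
sigma, n, reps = 2.0, 25, 50_000
# 50,000 sample means, each from a sample of size 25 with sd sigma = 2.
means = rng.normal(0.0, sigma, size=(reps, n)).mean(axis=1)

print(means.std(), sigma / np.sqrt(n))  # both close to 0.4
```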
(1 point) Let $Y_1, Y_2, \ldots, Y_n$ be a random sample from the probability density function
$$f(y \mid \alpha) = \begin{cases} \dfrac{(\alpha + 1)\,y^{\alpha}}{5^{\alpha+1}}, & 0 < y < 5 \\ 0, & \text{otherwise} \end{cases}$$
for $\alpha > -1$. Find an estimator for $\alpha$ using the method of moments.
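Assuming the intended density is $f(y \mid \alpha) = (\alpha+1)y^{\alpha}/5^{\alpha+1}$ on $(0, 5)$ (the only common form consistent with the stated condition $\alpha > -1$), the first moment is $E(Y) = 5(\alpha+1)/(\alpha+2)$, and solving for $\alpha$ gives the MOM estimator. A sketch under that assumption:

```python
import numpy as np

rng = np.random.default_rng(9)
alpha = 1.0
# Inverse-CDF sampling: F(y) = (y/5)^(alpha+1), so Y = 5 * U^(1/(alpha+1)).
y = 5 * rng.uniform(size=100_000) ** (1 / (alpha + 1))

ybar = y.mean()
alpha_mom = (2 * ybar - 5) / (5 - ybar)  # solves ybar = 5(alpha+1)/(alpha+2)
print(alpha_mom)  # close to the true alpha = 1.0
```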