For a model Y_i = β + u_i, if we assume E(u_i) = 0, what is the estimator of β using the method of moments?
Is the method-of-moments estimator unbiased? Prove your answer.
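A quick Monte Carlo sketch of the two questions above, assuming the model is the location model Y_i = β + u_i with E(u_i) = 0 (the true β = 2.0 and the normal errors here are arbitrary illustrative choices): equating E(Y_i) = β to the first sample moment gives β̂ = Ȳ, and the average of β̂ across many samples should sit near β.

```python
import random

def mom_beta(ys):
    """Method-of-moments estimator for Y_i = beta + u_i with E(u_i) = 0:
    equating E(Y) = beta to the first sample moment gives beta_hat = ybar."""
    return sum(ys) / len(ys)

# Monte Carlo check of unbiasedness (beta = 2.0, N(0,1) errors: arbitrary choices)
random.seed(0)
beta, n, reps = 2.0, 50, 5000
estimates = [mom_beta([beta + random.gauss(0, 1) for _ in range(n)])
             for _ in range(reps)]
avg = sum(estimates) / reps  # should be close to beta
```

The simulation only illustrates the algebraic fact E(Ȳ) = β + E(ū) = β.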
3. Consider a random sample Y_1, ..., Y_n from a Uniform[0, θ] distribution. In class we discussed the method of moments estimator θ̂ = 2Ȳ and the maximum likelihood estimator θ̂ = max(Y_1, ..., Y_n), and we derived the bias and MSE of both estimators. To correct the bias of the MLE θ̂ we proposed the following new estimator: θ̂_u = ((n+1)/n) · max(Y_1, ..., Y_n), where the subscript u stands for "unbiased."
(a) Find the MSE of θ̂_u.
(b) Compare the MSE of θ̂_u to the MSE of θ̂, the original...
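A simulation sketch of the comparison asked for in (b), assuming the three estimators are 2Ȳ, max(Y), and ((n+1)/n)·max(Y); the values θ = 1, n = 10 are arbitrary. The theory predicts MSE(2Ȳ) = θ²/(3n) > MSE(max) = 2θ²/((n+1)(n+2)) > MSE(θ̂_u) = θ²/(n(n+2)), and the Monte Carlo estimates should reproduce that ordering.

```python
import random

def mse_of_estimators(theta=1.0, n=10, reps=20000, seed=1):
    """Monte Carlo MSEs for three estimators of theta under Uniform[0, theta]:
    the MOM estimator 2*ybar, the MLE max(y), and the bias-corrected
    ((n+1)/n)*max(y)."""
    random.seed(seed)
    mom = mle = unb = 0.0
    for _ in range(reps):
        ys = [random.uniform(0, theta) for _ in range(n)]
        m = max(ys)
        mom += (2 * sum(ys) / n - theta) ** 2
        mle += (m - theta) ** 2
        unb += ((n + 1) / n * m - theta) ** 2
    return mom / reps, mle / reps, unb / reps
```

Note that the bias correction not only removes the bias but also lowers the MSE relative to the raw MLE.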
7. When we impose on the OLS estimation the restriction that the intercept estimator is zero, we call it regression through the origin. Consider a population model Y = β_0 + β_1 X + u, and suppose we estimate an OLS regression model through the origin: Ỹ = β̃_1 X (note that the true intercept parameter β_0 is not necessarily zero).
(i) Under assumptions SLR.1–SLR.4, either use the method of moments or minimize the SSR to show that β̃_1 = (Σ_{i=1}^n X_i Y_i) / (Σ_{i=1}^n X_i²).
(ii) Find E(β̃_1) in terms of...
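A simulation sketch of the bias issue raised in (ii), assuming the through-origin slope is β̃_1 = Σ X_iY_i / Σ X_i²; the parameter values (β_0 = 1, β_1 = 2, X uniform on [1, 3]) are arbitrary illustrative choices. When the true intercept is nonzero and the X_i are not centered at zero, the through-origin slope absorbs part of the intercept and its average drifts above β_1.

```python
import random

def slope_through_origin(xs, ys):
    """OLS slope with no intercept: beta1_tilde = sum(x*y) / sum(x^2)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Illustrative check (beta0 = 1, beta1 = 2, X ~ Uniform[1, 3]: arbitrary choices)
random.seed(2)
beta0, beta1, n, reps = 1.0, 2.0, 100, 4000
avg = 0.0
for _ in range(reps):
    xs = [random.uniform(1, 3) for _ in range(n)]
    ys = [beta0 + beta1 * x + random.gauss(0, 1) for x in xs]
    avg += slope_through_origin(xs, ys)
avg /= reps  # noticeably above beta1 = 2 because beta0 != 0 and E(X) != 0
```

The drift disappears if β_0 = 0 or if the X_i happen to sum to zero, which is exactly what the algebra in (ii) shows.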
Exercise 2b please!
Exercise 1. Consider the regression model through the origin y_i = β_1 x_i + ε_i, where ε_i ~ N(0, σ²). It is assumed that the regression line passes through the origin (0, 0).
a. Show that for this model s² = Σ_i (y_i − β̂_1 x_i)² / (n − 1) is an unbiased estimator of σ².
d. Show that (n − 1)s²/σ² ~ χ²_{n−1}, where s² is the unbiased estimator of σ² from question (a).
Exercise 2. Refer to Exercise 1.
a. Show that β̂_1 is BLUE (best linear unbiased estimator).
b. Show that β̂_1 has...
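A Monte Carlo sketch of Exercise 1(a), assuming the reconstruction s² = SSR/(n − 1) is the intended estimator (one parameter is fit, so one degree of freedom is lost); the values β_1 = 0.7, σ = 1.5, and the design X ~ Uniform[1, 2] are arbitrary. The average of s² over many samples should land near σ².

```python
import random

def s_squared(xs, ys):
    """Variance estimator for y_i = beta1*x_i + eps_i (no intercept):
    s^2 = SSR / (n - 1), since only one parameter is estimated."""
    b1 = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    ssr = sum((y - b1 * x) ** 2 for x, y in zip(xs, ys))
    return ssr / (len(xs) - 1)

# Monte Carlo check (beta1 = 0.7, sigma = 1.5: arbitrary choices)
random.seed(3)
sigma, n, reps = 1.5, 20, 5000
avg = 0.0
for _ in range(reps):
    xs = [random.uniform(1, 2) for _ in range(n)]
    ys = [0.7 * x + random.gauss(0, sigma) for x in xs]
    avg += s_squared(xs, ys)
avg /= reps  # should be close to sigma^2 = 2.25
```

Dividing by n instead of n − 1 would produce a systematically low average, which is one way to see why the degrees-of-freedom correction matters.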
Suppose X1, X2, ..., Xn is an iid sample from f_X(x | θ) = θ(1 − x)^(θ−1) · 1(0 < x < 1), where θ > 0.
(a) Find the method of moments (MOM) estimator of θ.
(b) Find the maximum likelihood estimator (MLE) of θ.
(c) Find the MLE of P_θ(X > 1/2).
(d) Is there a function of θ, say τ(θ), for which there exists an unbiased estimator whose variance attains the Cramér–Rao Lower Bound? If so, find it and identify the corresponding estimator. If not, show why not.
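A simulation sketch of parts (a) and (b), assuming the density θ(1 − x)^(θ−1) on (0, 1). For this density E(X) = 1/(θ + 1), so the MOM estimator is (1 − X̄)/X̄, and the MLE is −n/Σ log(1 − X_i). The CDF is F(x) = 1 − (1 − x)^θ, so draws can be generated by inverse transform as X = 1 − U^(1/θ); the value θ = 3 is an arbitrary choice.

```python
import math
import random

def mom_theta(xs):
    """MOM: E(X) = 1/(theta+1)  =>  theta_hat = (1 - xbar) / xbar."""
    xbar = sum(xs) / len(xs)
    return (1 - xbar) / xbar

def mle_theta(xs):
    """MLE: maximizing sum of log[theta*(1-x)^(theta-1)] gives
    theta_hat = -n / sum(log(1 - x_i))."""
    return -len(xs) / sum(math.log(1 - x) for x in xs)

# Inverse-transform sampling: X = 1 - U^(1/theta) (theta = 3: arbitrary choice)
random.seed(4)
theta, n = 3.0, 20000
xs = [1 - random.random() ** (1 / theta) for _ in range(n)]
# both estimators should land near theta = 3 for this large sample
```

Both estimators are consistent here; the MLE is the more efficient of the two, which is relevant background for the Cramér–Rao question in (d).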
Suppose we have a regression model Y_i = bX_i + ε_i, where Ȳ = X̄ = 0 and there is no intercept in the model. Consider the slope estimator b̂ = (Σ_i X_i Y_i) / (Σ_i X_i²). Show whether this estimator yields an unbiased estimate of b or not.
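Assuming the garbled expression above is the usual through-origin slope b̂ = Σ X_iY_i / Σ X_i² (and treating the X_i as fixed), the standard decomposition settles the unbiasedness question in two lines:

```latex
\hat{b}
  = \frac{\sum_i X_i Y_i}{\sum_i X_i^2}
  = \frac{\sum_i X_i (b X_i + \varepsilon_i)}{\sum_i X_i^2}
  = b + \frac{\sum_i X_i \varepsilon_i}{\sum_i X_i^2},
\qquad
E(\hat{b})
  = b + \frac{\sum_i X_i \, E(\varepsilon_i)}{\sum_i X_i^2}
  = b.
```

So unbiasedness follows directly from E(ε_i) = 0; no other assumption about the error distribution is needed.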
Consider the following model: Y_j = β_0 + β_1 X_{1j} + u_j ----(1)
a) Show whether the estimator β̂_1 is an unbiased estimator of β_1.
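A simulation sketch for part (a), assuming β̂_1 is the usual OLS slope with an intercept; the values β_0 = 1, β_1 = 0.5, and the Uniform[0, 10] design are arbitrary illustrative choices. Averaging the slope estimate over many samples should recover β_1.

```python
import random

def ols_slope(xs, ys):
    """OLS slope with intercept: sum((x-xbar)(y-ybar)) / sum((x-xbar)^2)."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den

# Monte Carlo check (beta0 = 1, beta1 = 0.5: arbitrary choices)
random.seed(5)
b0, b1, n, reps = 1.0, 0.5, 30, 4000
total = 0.0
for _ in range(reps):
    xs = [random.uniform(0, 10) for _ in range(n)]
    ys = [b0 + b1 * x + random.gauss(0, 1) for x in xs]
    total += ols_slope(xs, ys)
avg = total / reps  # should sit near b1 = 0.5
```

The simulation mirrors the algebraic argument: β̂_1 = β_1 + Σ(x_i − x̄)u_i / Σ(x_i − x̄)², whose second term has expectation zero.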
2. (Discrete uniform). Consider the PMF P(X = x) = 1/θ for x = 1, 2, ..., θ. You have a random sample of size three from this distribution: {2, 3, 10}.
a. Find the method of moments estimate for θ. HINT: a very useful fact is that Σ_{k=1}^n k = n(n+1)/2.
b. Find the MLE for θ.
c. Which estimator is unbiased?
d. Which estimator is preferred?
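The two estimates for the sample {2, 3, 10} can be worked out directly. Using the hint, E(X) = (θ + 1)/2, so the MOM estimate is 2X̄ − 1; the likelihood θ^(−n)·1(max ≤ θ) is decreasing in θ, so the MLE is the sample maximum.

```python
def mom_discrete_uniform(sample):
    """MOM for P(X=x) = 1/theta, x = 1..theta:
    E(X) = (theta+1)/2  =>  theta_hat = 2*xbar - 1."""
    return 2 * sum(sample) / len(sample) - 1

def mle_discrete_uniform(sample):
    """Likelihood theta^(-n) * 1(max <= theta) is maximized at the sample max."""
    return max(sample)

sample = [2, 3, 10]
mom = mom_discrete_uniform(sample)  # 2*5 - 1 = 9
mle = mle_discrete_uniform(sample)  # 10
```

Note the MOM estimate 9 is smaller than the observed value 10, i.e. it is incompatible with the data, which is worth mentioning when answering part (d).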
3. Consider the linear model Y_i = α + βx_i + ε_i for i = 1, ..., n, where E(ε_i) = 0. Further assume that Σ_i x_i = 0 and Σ_i x_i² = n.
(a) Show that the least squares estimates (LSEs) of α and β are given by α̂ = Ȳ and β̂ = (1/n) Σ_i x_i Y_i.
(b) Show that the LSEs in (a) are unbiased.
(c) Assume that E(ε_i²) = σ² and E(ε_i ε_j) = 0 for all i ≠ j, where σ² > 0. Show that V(β̂) = σ²/n and V(α̂) = σ²/n.
(d) Use (b) and (c) above to show that the LSEs are consistent...
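A simulation sketch of parts (b) and (c), assuming the reconstructed formulas α̂ = Ȳ and β̂ = (1/n)Σ x_iY_i. A design with half the x_i equal to +1 and half to −1 is a convenient (hypothetical) choice satisfying Σx_i = 0 and Σx_i² = n; the values α = 1, β = 2, σ = 1 are arbitrary. The empirical mean and variance of β̂ should match β and σ²/n.

```python
import random

# Fixed design with sum(x) = 0 and sum(x^2) = n: half +1, half -1 (hypothetical)
n = 20
xs = [1.0] * (n // 2) + [-1.0] * (n // 2)

def lse(xs, ys):
    """LSEs under sum(x)=0, sum(x^2)=n: alpha_hat = ybar, beta_hat = (1/n)*sum(x*y)."""
    m = len(xs)
    return sum(ys) / m, sum(x * y for x, y in zip(xs, ys)) / m

# Monte Carlo check (alpha = 1, beta = 2, sigma = 1: arbitrary choices)
random.seed(6)
alpha, beta, sigma, reps = 1.0, 2.0, 1.0, 8000
betas = []
for _ in range(reps):
    ys = [alpha + beta * x + random.gauss(0, sigma) for x in xs]
    betas.append(lse(xs, ys)[1])
mean_b = sum(betas) / reps
var_b = sum((b - mean_b) ** 2 for b in betas) / reps  # approx sigma^2/n = 0.05
```

Since both variances shrink like 1/n, Chebyshev's inequality then gives the consistency claim in (d).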
Consider the multiple regression model y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I.
Problem 1. Gauss–Markov theorem (revisited). We already know that E(β̂) = β and var(β̂) = σ²(X'X)⁻¹. Consider now another unbiased estimator of β, say b = AY. Since we are assuming that b is unbiased, we reach the conclusion that AX = I (why?). The Gauss–Markov theorem claims that var(b) − var(β̂) is positive semi-definite, which asks that we investigate q'[var(b) − var(β̂)]q. Show...
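A concrete numerical illustration of the Gauss–Markov claim, in the simplest through-origin case y_i = βx_i + ε_i with a fixed (hypothetical) design. Both estimators below are linear in Y and unbiased: the OLS estimator Σx_iy_i/Σx_i², and the alternative ȳ/x̄ (a valid b = AY with AX = 1). The theorem predicts the OLS variance σ²/Σx_i² is no larger than the alternative's σ²/(n·x̄²), and the simulation estimates should show that gap.

```python
import random

# Fixed design (hypothetical): compare two linear unbiased estimators of beta
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
sum_x2 = sum(x * x for x in xs)
xbar = sum(xs) / len(xs)

# Monte Carlo (beta = 1, sigma = 1: arbitrary choices)
random.seed(7)
beta, sigma, reps = 1.0, 1.0, 20000
ols, alt = [], []
for _ in range(reps):
    ys = [beta * x + random.gauss(0, sigma) for x in xs]
    ols.append(sum(x * y for x, y in zip(xs, ys)) / sum_x2)  # BLUE
    alt.append((sum(ys) / len(ys)) / xbar)                   # also linear, unbiased

def var(v):
    m = sum(v) / len(v)
    return sum((e - m) ** 2 for e in v) / len(v)

v_ols, v_alt = var(ols), var(alt)  # v_ols < v_alt, as Gauss-Markov predicts
```

In the matrix version of the problem, the same gap is exactly the quadratic form q'[var(b) − var(β̂)]q being nonnegative for every q.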
10. Let Y_1, ..., Y_n be a random sample from a distribution with pdf f(y) = (2/θ²)(θ − y) for 0 < y < θ, and f(y) = 0 elsewhere.
a) Find E(Y).
b) Find the method of moments estimator for θ.
c) Let θ̂ be the estimator from (b). Is it an unbiased estimator? Find the mean square error of θ̂. Show work.
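A simulation sketch, assuming the reconstructed density f(y) = (2/θ²)(θ − y) on (0, θ). For this density E(Y) = θ/3, so the MOM estimator is θ̂ = 3Ȳ. The CDF is F(y) = 1 − (1 − y/θ)², so draws can be generated by inverse transform as Y = θ(1 − √U); the value θ = 2 is an arbitrary choice.

```python
import math
import random

def mom_theta(ys):
    """MOM for f(y) = (2/theta^2)(theta - y) on (0, theta):
    E(Y) = theta/3  =>  theta_hat = 3 * ybar."""
    return 3 * sum(ys) / len(ys)

# Inverse-CDF sampling: F(y) = 1 - (1 - y/theta)^2  =>  Y = theta*(1 - sqrt(U))
random.seed(8)
theta, n = 2.0, 20000
ys = [theta * (1 - math.sqrt(random.random())) for _ in range(n)]
est = mom_theta(ys)  # should be close to theta = 2
```

Since E(3Ȳ) = 3·(θ/3) = θ, the estimator is unbiased, and its MSE reduces to its variance 9·Var(Y)/n, which is what part (c) asks you to compute.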