(i) Consider the simple regression model y = β0 + β1x + u under the first four Gauss-Markov assumptions. For some function g(x), for example g(x) = x² or g(x) = log(1 + x²), define z_i = g(x_i). Define a slope estimator as

$$\tilde{\beta}_1 = \frac{\sum_{i=1}^{n} (z_i - \bar{z})\, y_i}{\sum_{i=1}^{n} (z_i - \bar{z})\, x_i}.$$
Show that $\tilde{\beta}_1$ is linear and unbiased. Remember, because E(u|x) = 0, you can treat both x_i and z_i as nonrandom in your derivation.
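As a numerical companion to part (i), the sketch below simulates data from the model and computes $\tilde{\beta}_1$ for one sample. The choice of g, the sample size, and the parameter values are all illustrative assumptions, not part of the exercise.

```python
import numpy as np

# Illustrative sketch: compute the z-based slope estimator on simulated data.
# g(x) = log(1 + x^2) is one of the suggested choices; n, beta0, beta1 are
# arbitrary values picked for the demonstration.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(1, 5, size=n)      # treated as nonrandom in the derivation
z = np.log(1 + x**2)               # z_i = g(x_i)
beta0, beta1 = 2.0, 0.5

u = rng.normal(0, 1, size=n)       # E(u|x) = 0
y = beta0 + beta1 * x + u

# beta1_tilde = sum_i (z_i - zbar) y_i / sum_i (z_i - zbar) x_i
beta1_tilde = np.sum((z - z.mean()) * y) / np.sum((z - z.mean()) * x)
print(beta1_tilde)
```

With E(u|x) = 0 the estimate lands near the true β1 = 0.5, consistent with the unbiasedness you are asked to show.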
(ii) Add the homoskedasticity assumption, MLR.5. Show that

$$\mathrm{Var}(\tilde{\beta}_1) = \frac{\sigma^2 \sum_{i=1}^{n} (z_i - \bar{z})^2}{\left[\sum_{i=1}^{n} (z_i - \bar{z})\, x_i\right]^2}.$$
(iii) Show directly that, under the Gauss-Markov assumptions, $\mathrm{Var}(\hat{\beta}_1) \le \mathrm{Var}(\tilde{\beta}_1)$, where $\hat{\beta}_1$ is the OLS estimator. [Hint: The Cauchy-Schwarz inequality in Appendix B implies that

$$\left[ n^{-1} \sum_{i=1}^{n} (z_i - \bar{z})(x_i - \bar{x}) \right]^2 \le \left[ n^{-1} \sum_{i=1}^{n} (z_i - \bar{z})^2 \right] \left[ n^{-1} \sum_{i=1}^{n} (x_i - \bar{x})^2 \right];$$
notice that we can drop $\bar{x}$ from the sample covariance.]
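The variance comparison in part (iii) can be checked numerically: plug a fixed set of regressor values into the part (ii) formula and into the standard OLS variance $\sigma^2/\sum_i (x_i - \bar{x})^2$, and confirm the ranking. The values of n, σ², and g below are illustrative assumptions.

```python
import numpy as np

# Illustrative check of part (iii): under homoskedasticity, the exact
# variance formulas satisfy Var(beta1_hat) <= Var(beta1_tilde) for any g,
# by the Cauchy-Schwarz inequality.  Here g(x) = x^2.
rng = np.random.default_rng(1)
n, sigma2 = 100, 1.0
x = rng.uniform(1, 5, size=n)      # fixed regressor values
z = x**2                           # z_i = g(x_i)

# OLS slope variance under MLR.1-MLR.5
var_ols = sigma2 / np.sum((x - x.mean())**2)

# Part (ii) formula for the z-based estimator
var_tilde = sigma2 * np.sum((z - z.mean())**2) / np.sum((z - z.mean()) * x)**2

print(var_ols <= var_tilde)        # → True (Cauchy-Schwarz guarantees this)
```

The inequality holds for every draw of x because $\sum_i (z_i - \bar{z}) x_i = \sum_i (z_i - \bar{z})(x_i - \bar{x})$ (the point of the hint), so the hint's inequality bounds the denominator of the part (ii) formula.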