6. Consider the following regression model without an intercept: Y_i = β_1 X_i + u_i. One possible estimator...
Problem 3: Absence of Intercept. Consider the regression model Y_i = βX_i + u_i, where u_i and X_i satisfy Assumptions SLR.1–SLR.5. (i) Let β̃ denote an estimator of β constructed as β̃ = Ȳ/X̄, where Ȳ and X̄ are the sample means of Y_i and X_i, respectively. Show that β̃ is conditionally unbiased. (ii) Derive the least squares estimator of β. (iii) Show that the least squares estimator is conditionally unbiased. (iv) Derive the conditional variance of the least squares estimator.
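Parts (i)–(iii) can be sanity-checked with a small Monte Carlo sketch in Python (the sample size, regressor range, error distribution, and true β below are illustrative assumptions, not part of the problem): both the ratio estimator β̃ = Ȳ/X̄ and the least squares estimator ΣX_iY_i/ΣX_i² should center on the true β.

```python
import numpy as np

rng = np.random.default_rng(0)
beta, n, reps = 2.0, 50, 20000          # assumed true slope and sample size

# Fix the regressors (we condition on X); keep them positive so Xbar != 0.
X = rng.uniform(1.0, 3.0, size=n)

ratio_est = np.empty(reps)
ols_est = np.empty(reps)
for r in range(reps):
    u = rng.normal(0.0, 1.0, size=n)    # homoskedastic, mean-zero errors
    Y = beta * X + u
    ratio_est[r] = Y.mean() / X.mean()  # beta-tilde = Ybar / Xbar
    ols_est[r] = (X @ Y) / (X @ X)      # least squares: sum(XY)/sum(X^2)

# Across simulations, both averages sit close to the true beta = 2.
print(ratio_est.mean(), ols_est.mean())
```

This does not replace the algebraic proof the problem asks for; it only illustrates the conditional-unbiasedness claims under one assumed data-generating process.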
7. When we impose the restriction that the intercept estimator is zero in OLS estimation, we call it regression through the origin. Consider a population model Y = β_0 + β_1 X + u, and suppose we estimate an OLS regression through the origin: Ŷ = β̃_1 X (note that the true intercept parameter β_0 is not necessarily zero). (i) Under Assumptions SLR.1–SLR.4, either use the method of moments or minimize the SSR to show that β̃_1 = (Σ_{i=1}^n X_i Y_i) / (Σ_{i=1}^n X_i²). (ii) Find E(β̃_1) in terms...
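Part (ii) can be previewed numerically (a hedged sketch; all parameter values are assumed for illustration): substituting Y_i = β_0 + β_1 X_i + u_i into the through-origin estimator gives E(β̃_1 | X) = β_1 + β_0 ΣX_i / ΣX_i², so the estimator is biased whenever β_0 ≠ 0.

```python
import numpy as np

rng = np.random.default_rng(1)
b0, b1, n, reps = 1.0, 2.0, 40, 20000   # assumed true intercept and slope

X = rng.uniform(0.5, 2.5, size=n)       # fixed regressors (condition on X)
est = np.empty(reps)
for r in range(reps):
    u = rng.normal(0.0, 1.0, size=n)
    Y = b0 + b1 * X + u                 # true model has an intercept
    est[r] = (X @ Y) / (X @ X)          # through-origin estimator

# Conditional mean of the estimator: b1 + b0 * sum(X) / sum(X^2).
expected = b1 + b0 * X.sum() / (X @ X)
print(est.mean(), expected)
```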
Consider the simple linear regression model where β_0 is known. (a) Find the least squares estimator β̂_1 of β_1. (b) Is this estimator unbiased? Prove your result.
Consider the model y = a + bX + e. Show that the least squares estimator of b is unbiased and consistent. You may assume that the five standard disturbance-term assumptions hold. Explain why each step is true.
Consider the fitted values that result from performing simple linear regression without an intercept, i.e., the model is Y = βX + error. (a) By minimizing the RSS, find the estimated coefficient β̂ (the least squares estimator). (b) Show that the least squares estimator is unbiased, i.e., E(β̂) = β. (c) (5 points) What is the variance of the estimator? That is, find Var(β̂).
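For part (c), the conditional variance of the no-intercept least squares estimator works out to Var(β̂ | X) = σ² / ΣX_i². A simulation sketch (parameter values assumed for illustration) can confirm the formula:

```python
import numpy as np

rng = np.random.default_rng(2)
beta, sigma, n, reps = 1.5, 2.0, 30, 40000   # assumed slope, error sd, n

X = rng.uniform(1.0, 2.0, size=n)            # fixed regressors
est = np.empty(reps)
for r in range(reps):
    u = rng.normal(0.0, sigma, size=n)
    Y = beta * X + u
    est[r] = (X @ Y) / (X @ X)               # no-intercept least squares

# Simulated variance vs. the theoretical value sigma^2 / sum(X^2).
print(est.var(), sigma**2 / (X @ X))
```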
12. (a) The ordinary least squares estimate of β in the classical linear regression model Y_i = α + βX_i + u_i, i = 1, 2, …, n, is β̂ = (Σ_{i=1}^n x_i y_i) / (Σ_{i=1}^n x_i²), where x_i = X_i − X̄ and X̄ = n⁻¹ Σ_{i=1}^n X_i. Show that if Var(u_i) = σ² for all i, no other linear unbiased estimator of β can be constructed with a smaller variance. (All symbols have their usual meaning.)
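One standard way to argue the efficiency claim (a sketch of the usual Gauss–Markov argument, not text from the source) is to write any linear unbiased estimator as the OLS weights plus a perturbation:

```latex
% Any linear estimator: \tilde\beta = \sum_i c_i Y_i.
% Unbiasedness for all (\alpha, \beta) forces \sum_i c_i = 0 and \sum_i c_i X_i = 1.
% Write c_i = w_i + d_i, where w_i = x_i / \sum_j x_j^2 are the OLS weights.
% The constraints imply \sum_i d_i x_i = 0, hence \sum_i w_i d_i = 0, so
\operatorname{Var}(\tilde\beta)
  = \sigma^2 \sum_i c_i^2
  = \sigma^2 \sum_i w_i^2 + \sigma^2 \sum_i d_i^2
  \;\ge\; \sigma^2 \sum_i w_i^2
  = \frac{\sigma^2}{\sum_i x_i^2}
  = \operatorname{Var}(\hat\beta),
% with equality iff every d_i = 0, i.e. iff \tilde\beta is the OLS estimator.
```

The key step is the cross-term vanishing: because Σ_i d_i = 0 and Σ_i d_i X_i = 0, the perturbation is orthogonal to the OLS weights, so any deviation from OLS can only add variance.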
Consider the regression model y = β_0 + β_1 x_1 + β_2 x_2 + u. Suppose this is estimated by Feasible Weighted Least Squares (FWLS) assuming a conditional variance function Var(u|x) = σ²h(x). Which of the following statements is correct? A) The function h(x) does not need to be estimated as part of the procedure. B) If the assumption about the conditional variance of the error term is incorrect, then FWLS is still consistent. C) FWLS is the best linear unbiased estimator when there is heteroscedasticity. D) None of the above answers...
1. Consider the simple linear regression model where β_0 is known. (a) Find the least squares estimator β̂_1 of β_1. (b) Is this estimator unbiased? Prove your result. (c) Find an expression for Var(β̂_1 | x_1, …, x_n) in terms of x_1, …, x_n and σ².
Consider the following slope estimator: b = (Σ_{i=1}^n Y_i) / (Σ_{i=1}^n X_i). Suppose the true model is Y_i = β_0 + β_1 X_i + e_i and the model satisfies the Gauss–Markov conditions. Answer the following questions: (a) What assumption, in addition to the Gauss–Markov assumptions, is required to estimate the model? (b) Show that, in general, b is a biased estimator of β_1. (c) Outline the special condition(s) under which b is an unbiased estimator of β_1.
Problem 2. (Regression without intercept, 50 pts) Suppose you are given the model Y_i = βX_i + u_i, E[u_i | X_i] = 0. A) Derive the OLS estimator β̂. B) After you estimate β̂, you can obtain the residuals û_i = Y_i − β̂X_i. Does Σ_{i=1}^n û_i = 0? Explain why and show your derivation.
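A quick numerical check of part B (a sketch with assumed data; it does not replace the derivation): without an intercept, the single first-order condition forces Σ X_i û_i = 0, but nothing forces Σ û_i = 0.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
X = rng.uniform(1.0, 2.0, size=n)              # assumed regressors
Y = 2.0 * X + rng.normal(0.0, 1.0, size=n)     # assumed true slope of 2

beta_hat = (X @ Y) / (X @ X)                   # no-intercept OLS estimator
resid = Y - beta_hat * X

# The FOC makes sum(X_i * u_hat_i) exactly 0 (up to floating point),
# while sum(u_hat_i) is generally nonzero in this model.
print(X @ resid, resid.sum())
```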