The assumptions for OLS (Ordinary Least Squares) are:
1) The regression model should be linear in parameters. This condition is fulfilled.
2) The expected value of the stochastic term should be zero. This may not always be true.
3) The error term is homoskedastic, i.e., its variance is the same across the whole sample.
4) The correlation between the x variable and the error term should be zero. This condition may not be fulfilled here.
5) There should be no autocorrelation among the disturbances. This is true for the given model.
6) The error term should be normally distributed with mean zero and constant variance σ². This condition is satisfied in the model.
7) There should be no specification bias in the model. This may or may not be true here, as the intercept has been dropped; there should be a robust argument in favor of dropping such a crucial parameter from the model.
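The assumptions above can be illustrated with a small simulation sketch (data and parameter values are assumed for illustration, not taken from the text): generate data that satisfies the classical assumptions, fit OLS via the normal equations, and check the mechanical residual properties.

```python
# Minimal sketch (simulated data, assumed values): a model that is linear in
# parameters with zero-mean, homoskedastic, normal errors independent of x.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x = rng.uniform(0.0, 10.0, n)
e = rng.normal(0.0, 2.0, n)            # mean zero, constant variance (homoskedastic)
y = 1.5 + 0.8 * x + e                  # linear in parameters

# OLS via the normal equations (X'X) b = X'y
X = np.column_stack([np.ones(n), x])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat

print(beta_hat)                        # close to the true values [1.5, 0.8]
print(resid.mean())                    # ~0: residuals sum to zero when an intercept is included
print(np.corrcoef(x, resid)[0, 1])     # ~0: residuals orthogonal to x by construction
```

Note that the zero sample mean of the residuals is a consequence of including the intercept column; it is exactly this property that is lost in the zero-intercept model discussed below.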
1. Consider a regression model Yi = Xiβ + ei, i = 1,...,n. You estimate this model...
Consider the zero-intercept model given by Yi = B1Xi + ei (i = 1,…,n), with the ei normal and independent with variance sigma^2. For this model: (i) find Σ(Yi − Ŷi); (ii) find Σ(Yi − Ŷi)Xi; (iii) find the estimator of the error variance sigma^2; (iv) is the estimator of the error variance biased?
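A numeric sketch of parts (i)–(iii) (the data here are simulated and the values assumed, not from the text): in the zero-intercept model the single normal equation forces Σei·Xi = 0 exactly, but Σei is generally nonzero because there is no intercept column to absorb it, and the error-variance estimator divides by n − 1 since only one parameter is estimated.

```python
# Sketch for Yi = B1*Xi + ei fitted through the origin (assumed simulated data).
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.uniform(1.0, 5.0, n)
y = 2.0 * x + rng.normal(0.0, 1.0, n)   # true slope 2, error variance 1

b1 = (x @ y) / (x @ x)                  # OLS slope through the origin
e = y - b1 * x

print((e * x).sum())                    # (ii) = 0 exactly (up to floating point)
print(e.sum())                          # (i) nonzero in general: no intercept column
s2 = (e @ e) / (n - 1)                  # (iii) one estimated parameter => n - 1 df
print(s2)                               # near the true error variance of 1
```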
Suppose we have a regression model Yi = bXi + ei, where Ȳ = X̄ = 0 and there is no intercept in the model. Consider the slope estimator b̂ = Σ(Xi²Yi) / Σ(Xi²). Show whether this will yield an unbiased estimate of b or not.
5) Consider the simple linear regression model yi = α + βxi + εi, εi ~ N(0, σ²), i = 1,...,n. Let ȳ be the mean of the yi, and let α̂ and β̂ be the MLEs of α and β, respectively. Let ŷi = α̂ + β̂xi be the fitted values, and let ei = yi − ŷi be the residuals. a) What is Cov(ȳ, β̂)? b) What is Cov(α̂, β̂)? c) Show that Σ ei = 0. d) Show that Σ xiei = 0. e) Show that Σ ŷiei = ...
Consider the least-squares residuals ei = yi − ŷi, i = 1, 2, ..., n, from the simple linear regression model. Find the variance of the residuals, Var(ei). Is the variance of the residuals a constant? Discuss.
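The key fact behind this exercise is Var(ei) = σ²(1 − hii), where hii is the i-th diagonal element of the hat matrix H = X(X′X)⁻¹X′. A short sketch (the design points and σ² below are assumed for illustration) shows that the residual variance is not constant, because leverage hii varies with xi:

```python
# Sketch (assumed design matrix): Var(ei) = sigma^2 * (1 - h_ii) is smallest
# at high-leverage points, so residual variances are NOT constant even though
# the error variance sigma^2 is.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 10.0])    # one high-leverage point at x = 10
X = np.column_stack([np.ones_like(x), x])
H = X @ np.linalg.inv(X.T @ X) @ X.T
h = np.diag(H)                               # leverages: [0.38, 0.28, 0.22, 0.20, 0.92]

sigma2 = 4.0                                 # illustrative error variance
var_resid = sigma2 * (1.0 - h)

print(h.sum())                               # leverages sum to the number of parameters (2)
print(var_resid)                             # smallest for the high-leverage point x = 10
```

The design choice of an outlying x value makes the non-constancy easy to see: the fitted line is pulled toward the high-leverage observation, so its residual has the smallest variance.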
Consider the regression model given by: Yi = β0 + β1Xi + β2Zi + ui. Suppose that an econometrician wishes to test the null hypothesis given by H0: β1 + β2 = 1. Use this null hypothesis to specify a restricted form of the regression model (in a form that may be estimated using an OLS estimation procedure). State the equation that you could estimate as the restricted version of this model.
Consider the regression model given by: Yi = β0 + β1Xi + β2Zi + ui. Suppose that an econometrician wishes to test the null hypothesis given by H0: β1 + β2 = 0. Use this null hypothesis to specify a restricted form of the regression model (in a form that may be estimated using an OLS estimation procedure). State the equation that you could estimate as the restricted version of this model.
3. Consider the linear model: Yi = α + βxi + εi for i = 1, ..., n, where E(εi) = 0. Further assume that Σxi = 0 and Σxi² = n. (a) Show that the least squares estimates (LSEs) of α and β are given by α̂ = Ȳ and β̂ = (1/n)ΣxiYi. (b) Show that the LSEs in (a) are unbiased. (c) Assume that E(εi²) = σ² and E(εiεj) = 0 for all i ≠ j, where σ² > 0. Show that V(α̂) = σ²/n and V(β̂) = σ²/n. (d) Use (b) and (c) above to show that the LSEs are consistent...
4. Consider the model yi = β1 + β2xi + ei. Find the OLS estimator for β.
Exercise 5. Consider a linear model with n = 2m in which Yi = β0 + β1Xi + εi, i = 1, ..., m, and Yi = β0 + β2Xi + εi, i = m + 1, ..., n. Here ε1, ..., εn are i.i.d. from N(0, σ²); β = (β0, β1, β2)′ and σ² are unknown parameters; X1, ..., Xn are known constants with X1 + ... + Xm = Xm+1 + ... + Xn = 0. 1. Write the model in vector form...
Question 2 (10 points). You are given the following model: yi = βxi + ei. Consider two alternative estimators of β: b1 = Σxiyi / Σxi² and b2 = Σyi / Σxi. 1. Which estimator would you choose, and why, if the model satisfies all the assumptions of classical regression? Prove your results. (4 points) 2. Now suppose that var(yi) = hxi, where h is a positive constant. (a) Obtain the correct variance of the OLS estimator. (2 points) (b) Show that the BLU estimator is now b2. Derive its variance....
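A Monte Carlo sketch of part 2 (simulated data; all numeric values assumed): with var(ei) = h·xi, weighting each squared error by 1/xi in the GLS criterion gives exactly b2 = Σyi/ΣXi as the BLU estimator, with Var(b2) = h/Σxi, while b1 = Σxiyi/Σxi² (OLS) stays unbiased but has the larger variance h·Σxi³/(Σxi²)².

```python
# Hedged simulation sketch comparing the exercise's two estimators under
# heteroskedasticity var(ei) = h * xi (all parameter values assumed).
import numpy as np

rng = np.random.default_rng(3)
beta, h, n, reps = 2.0, 1.0, 200, 20_000
x = rng.uniform(0.5, 4.0, n)                 # fixed positive regressors

E = rng.normal(0.0, np.sqrt(h * x), size=(reps, n))   # var(ei) = h * xi
Y = beta * x + E
b1s = (Y @ x) / (x @ x)                      # OLS: b1 = sum(x*y)/sum(x^2)
b2s = Y.sum(axis=1) / x.sum()                # BLU here: b2 = sum(y)/sum(x)

print(b1s.mean(), b2s.mean())                # both close to beta = 2 (both unbiased)
print(b1s.var(), b2s.var())                  # b2 has the smaller variance
print(h / x.sum())                           # theoretical Var(b2) = h / sum(x)
```

The design point is that b2 arises from minimizing Σ(yi − βxi)²/xi, i.e., GLS with weights proportional to the inverse error variances, which is why it beats OLS only under this variance structure.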