Exercise 5. Consider the multiple regression model y = Xβ + ε. The Gauss-Markov conditions hold and also ε ~ ...
e. Consider the multiple regression model Y = X₁β₁ + X₂β₂ + ε. The Gauss-Markov conditions hold. Show that Y′(I − H)Y = Y′(I − H₁)Y − β̂₂′X₂′(I − H₁)Y.
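As a numeric sanity check of this extra-sum-of-squares identity (not a substitute for the proof), the sketch below fits the full and partial models on simulated data and compares the two sides. All names and dimensions here are illustrative choices, not part of the exercise.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p1, p2 = 30, 2, 2
X1 = np.column_stack([np.ones(n), rng.normal(size=(n, p1 - 1))])
X2 = rng.normal(size=(n, p2))
X = np.hstack([X1, X2])                 # full design [X1, X2]
Y = rng.normal(size=n)

def hat(A):
    """Hat (projection) matrix of a design matrix A."""
    return A @ np.linalg.solve(A.T @ A, A.T)

H, H1 = hat(X), hat(X1)
I = np.eye(n)
beta = np.linalg.solve(X.T @ X, X.T @ Y)
beta2 = beta[p1:]                       # coefficients on X2 in the full fit

lhs = Y @ (I - H) @ Y
rhs = Y @ (I - H1) @ Y - beta2 @ X2.T @ (I - H1) @ Y
assert np.isclose(lhs, rhs)             # the identity holds numerically
```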
Exercise 1. Answer the following questions:
a. Consider the multiple regression model y = Xβ + ε subject to a set of linear constraints of the form Cβ = γ, where C is an m × (k + 1) matrix. The Gauss-Markov conditions hold and also ε ~ N(0, σ²I). Is it true that we can test the hypothesis Cβ = γ using an F statistic of the form F = [(SSR_reduced − SSR_full)/m] / [SSR_full/(n − k − 1)]? Please explain.
b. Refer to question (a). Let H and H₁ be the hat matrices of the full and reduced models, respectively. Show...
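To illustrate part (a) concretely, the sketch below computes the full-versus-reduced F statistic for the simplest constraint of this form (two coefficients set to zero). The data, coefficients, and seed are all hypothetical; the point is only the mechanics of comparing residual sums of squares.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, m = 50, 3, 2                      # k predictors plus intercept; m constraints
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([1.0, 2.0, 0.0, 0.0]) + rng.normal(size=n)

def sse(A):
    """Sum of squared residuals from regressing y on design A."""
    r = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return r @ r

sse_full = sse(X)
sse_red = sse(X[:, :k + 1 - m])         # reduced model imposes the m zero constraints
F = ((sse_red - sse_full) / m) / (sse_full / (n - k - 1))
```

Under H₀ and normal errors, F follows an F(m, n − k − 1) distribution, which is what justifies the test in part (a).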
Problem 1. Gauss-Markov theorem (revisited). Consider the multiple regression model y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I. We already know that E(β̂) = β and var(β̂) = σ²(X′X)⁻¹. Consider now another unbiased estimator of β, say b = AY. Since we are assuming that b is unbiased, we reach the conclusion that AX = I (why?). The Gauss-Markov theorem claims that var(b) − var(β̂) is positive semi-definite, which asks that we investigate q′[var(b) − var(β̂)]q. Show...
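A quick numerical companion to this problem: construct one particular alternative unbiased linear estimator b = AY (the construction of A below is an arbitrary choice satisfying AX = I, not the only one) and check that var(b) − var(β̂) has nonnegative eigenvalues, i.e. is positive semi-definite.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 20, 3
X = rng.normal(size=(n, k))
XtX_inv = np.linalg.inv(X.T @ X)
H = X @ XtX_inv @ X.T

# Build A = (X'X)^{-1}X' + D with DX = 0, e.g. D = q r'(I - H),
# so that AX = I and b = AY remains unbiased.
D = np.outer(rng.normal(size=k), rng.normal(size=n)) @ (np.eye(n) - H)
A = XtX_inv @ X.T + D
assert np.allclose(A @ X, np.eye(k))    # unbiasedness condition AX = I

# var(b) - var(beta_hat) = sigma^2 (A A' - (X'X)^{-1}); sigma^2 > 0 cancels,
# so it suffices to check A A' - (X'X)^{-1} is positive semi-definite.
diff = A @ A.T - XtX_inv
eigs = np.linalg.eigvalsh(diff)
assert eigs.min() > -1e-10              # all eigenvalues >= 0 up to rounding
```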
In the context of multiple regression, define the n × n matrix M = I − X(X′X)⁻¹X′. (i) Show that M is symmetric and idempotent. (ii) Prove that m_tt, the diagonal elements of M, satisfy 0 ≤ m_tt ≤ 1 for t = 1, 2, ..., n. (iii) Consider the linear model y = Xβ + u satisfying the Gauss-Markov assumptions. Let û be the vector of OLS residuals. Show that E(ûû′ | X) = σ²M. (iv) Conclude that while the errors {u_t}...
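Parts (i) and (ii) can be verified numerically before proving them; the sketch below (with an arbitrary simulated design, chosen only for illustration) checks symmetry, idempotence, and the bounds on the diagonal of M.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 15, 3
X = rng.normal(size=(n, k))
M = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)   # residual-maker matrix

assert np.allclose(M, M.T)              # (i) symmetric
assert np.allclose(M @ M, M)            # (i) idempotent
d = np.diag(M)
assert np.all((d >= -1e-12) & (d <= 1 + 1e-12))     # (ii) 0 <= m_tt <= 1
```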
a. Consider the multiple regression model y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I, and a linear function c′β of β. Show that the change in the estimate c′β̂ when the i-th observation is deleted is c′β̂ − c′β̂₍ᵢ₎ = c′(X′X)⁻¹xᵢ êᵢ/(1 − hᵢᵢ). Hint: consider a′ = c′(X′X)⁻¹X′.
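The closed-form deletion formula stated above (reconstructed here as the standard leave-one-out result) can be checked by actually refitting without observation i and comparing. Everything in the sketch besides that formula is an arbitrary simulated setup.

```python
import numpy as np

rng = np.random.default_rng(4)
n, k = 25, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
y = rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y
e = y - X @ beta                                   # OLS residuals
h = np.einsum('ij,jk,ik->i', X, XtX_inv, X)        # leverages h_ii

i = 7                                              # observation to delete
beta_i = np.linalg.lstsq(np.delete(X, i, 0), np.delete(y, i), rcond=None)[0]
closed = XtX_inv @ X[i] * e[i] / (1 - h[i])        # claimed beta_hat - beta_hat(i)
assert np.allclose(beta - beta_i, closed)
```

Premultiplying `closed` by any c′ then gives the stated change in c′β̂.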
3. Consider the multiple linear regression model Y_i = β₀ + β₁X₁,ᵢ + ⋯ + β_{p−1}X_{p−1},ᵢ + ε_i, where X₁,ᵢ, ..., X_{p−1},ᵢ are observed covariate values for observation i, and ε_i ~ iid N(0, σ²). (a) What is the interpretation of β₁ in this model? (b) Write the matrix form of the model. Label the response vector, design matrix, coefficient vector, and error vector, and specify the dimensions and elements for each. (c) Write the likelihood, log-likelihood, and ∂ℓ/∂β in matrix form. (d) Solve ∂ℓ/∂β = 0 for β̂, the MLE of the...
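For part (d), solving ∂ℓ/∂β = 0 under normal errors yields the normal equations X′Xβ = X′y, so the MLE coincides with the OLS estimator. A short check on simulated data (dimensions and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 40, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ rng.normal(size=p) + rng.normal(size=n)

# Setting the score to zero gives the normal equations X'X beta = X'y
beta_mle = np.linalg.solve(X.T @ X, X.T @ y)
beta_lstsq = np.linalg.lstsq(X, y, rcond=None)[0]   # numerical least squares
assert np.allclose(beta_mle, beta_lstsq)

# The MLE of sigma^2 is SSE/n (note: not the unbiased SSE/(n - p))
e = y - X @ beta_mle
sigma2_mle = e @ e / n
```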
Q4. [40 points] Consider the multiple linear regression model given by y = Xβ + ε, where y and ε are vectors of size 8 × 1, X is a matrix of size 8 × 3, and β is a vector of size 3 × 1. Also, the following information is available: ê′ê = 22, ... , and X′y = ...
1. [10 points] Estimate the regression coefficients in the model given above.
2. [4 points] Estimate the variance of the error term...
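The specific matrices in this exercise did not survive extraction, so the sketch below uses hypothetical stand-in values for X′X and X′y (only ê′ê = 22 and the dimensions n = 8, p = 3 come from the exercise). It shows the mechanics: β̂ solves the normal equations, and the usual error-variance estimate is ê′ê/(n − p).

```python
import numpy as np

n, p = 8, 3
XtX = np.array([[8., 2., 1.],
                [2., 6., 0.],
                [1., 0., 5.]])          # HYPOTHETICAL stand-in for X'X
Xty = np.array([10., 4., 3.])           # HYPOTHETICAL stand-in for X'y
sse = 22.0                              # e'e = 22, as given in the exercise

beta_hat = np.linalg.solve(XtX, Xty)    # part 1: regression coefficients
sigma2_hat = sse / (n - p)              # part 2: unbiased estimate, 22/5 = 4.4
```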
2. The linear regression model in matrix format is Y = Xβ + e, with the usual definitions. Let E(e|X) = 0 and Σ = diag(γ₁, γ₂, ..., γ_N). Notice that, as a covariance matrix, Σ is symmetric and nonnegative definite. (i) Derive Var(β̂_OLS | X). (ii) Let B̃ = CY be any other linear unbiased estimator, where C is an N × K function of X. Prove Var(B̃ | X) ≥ (X′Σ⁻¹X)⁻¹.
3. An oracle...
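For parts (i)-(ii) above, a numerical companion: under diagonal Σ, Var(β̂_OLS | X) takes the sandwich form (X′X)⁻¹X′ΣX(X′X)⁻¹, and it should dominate the GLS variance (X′Σ⁻¹X)⁻¹ in the positive semi-definite sense, since OLS is just one linear unbiased estimator. The design and the γ values below are arbitrary simulated choices.

```python
import numpy as np

rng = np.random.default_rng(6)
n, k = 20, 3
X = rng.normal(size=(n, k))
gam = rng.uniform(0.5, 2.0, size=n)     # diagonal entries gamma_1..gamma_N
Sigma = np.diag(gam)

XtX_inv = np.linalg.inv(X.T @ X)
var_ols = XtX_inv @ X.T @ Sigma @ X @ XtX_inv    # sandwich form, part (i)
var_gls = np.linalg.inv(X.T @ np.diag(1 / gam) @ X)

# Part (ii): Var(OLS) - Var(GLS) should be positive semi-definite
eigs = np.linalg.eigvalsh(var_ols - var_gls)
assert eigs.min() > -1e-10
```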