Problem 1

a. Consider the multiple regression model y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I, and a linear function c'β of β. Show that the change in the estimate c'β̂ when the i-th observation is deleted is

    c'β̂ − c'β̂_(i) = c'C ê_i / (1 − h_ii),   where C = (X'X)^(-1) x_i,

x_i' is the i-th row of X, ê_i is the i-th OLS residual, and h_ii = x_i'(X'X)^(-1) x_i.
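The deletion identity above can be checked numerically. This is a minimal sketch; the simulated design matrix, response, and contrast vector c are illustrative assumptions, not part of the problem's data.

```python
# Numeric check of the case-deletion identity
#   c'b - c'b_(i) = c'(X'X)^{-1} x_i * e_i / (1 - h_ii)
# (simulated X, y, and c are illustrative assumptions)
import numpy as np

rng = np.random.default_rng(0)
n, p, i = 20, 3, 5
X = rng.normal(size=(n, p))
y = rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y                 # full-data OLS estimate
e = y - X @ b                         # OLS residuals
h_ii = X[i] @ XtX_inv @ X[i]          # leverage of observation i

# Refit with observation i deleted
X_del = np.delete(X, i, axis=0)
y_del = np.delete(y, i)
b_del = np.linalg.solve(X_del.T @ X_del, X_del.T @ y_del)

c = rng.normal(size=p)                # arbitrary linear combination
lhs = c @ b - c @ b_del
rhs = c @ XtX_inv @ X[i] * e[i] / (1 - h_ii)
print(np.isclose(lhs, rhs))
```

Because the identity is exact, the two sides agree to floating-point precision for any data set.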
e. Consider the multiple regression model Y = X₁β₁ + X₂β₂ + ε. The Gauss-Markov conditions hold. Show that

    Y'(I − H)Y = Y'(I − H₁)Y − β̂₂' X₂' (I − H₁) Y,

where H and H₁ are the hat matrices of X = [X₁ X₂] and of X₁, respectively, and β̂₂ is the full-model estimate of β₂.
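The extra-sum-of-squares identity in part (e) can be verified on simulated data. The design matrices and response below are illustrative assumptions:

```python
# Numeric check of: Y'(I-H)Y = Y'(I-H1)Y - b2' X2' (I-H1) Y,
# with b2 taken from the full fit (simulated data is an assumption)
import numpy as np

rng = np.random.default_rng(1)
n, p1, p2 = 30, 2, 3
X1 = rng.normal(size=(n, p1))
X2 = rng.normal(size=(n, p2))
Y = rng.normal(size=n)

X = np.hstack([X1, X2])
H = X @ np.linalg.inv(X.T @ X) @ X.T      # hat matrix of [X1 X2]
H1 = X1 @ np.linalg.inv(X1.T @ X1) @ X1.T # hat matrix of X1
I = np.eye(n)

b = np.linalg.solve(X.T @ X, X.T @ Y)
b2 = b[p1:]                               # coefficients on X2 in the full model

lhs = Y @ (I - H) @ Y
rhs = Y @ (I - H1) @ Y - b2 @ X2.T @ (I - H1) @ Y
print(np.isclose(lhs, rhs))
```

The identity is a direct consequence of the Frisch-Waugh-Lovell decomposition, so the check passes exactly (up to rounding).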
Q. 1 Consider the multiple linear regression model Y = Xβ + ε, where ε ~ MVN(0, σ²V) and V ≠ I_n is a diagonal matrix.
a) Derive the weighted least squares estimator of β, i.e., β̂_WLS.
b) Show that β̂_WLS is an unbiased estimator of β.
c) Derive the variances of β̂_WLS and of the OLS estimator of β. Is the OLS estimator of β still the BLUE? In one sentence, explain why or why not.
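As a sketch of parts (a) and (b): the WLS estimator is β̂_WLS = (X'V⁻¹X)⁻¹X'V⁻¹Y, and unbiasedness follows algebraically because the estimator is A·Y with A·Xβ = β. The diagonal V, β, and design below are illustrative assumptions:

```python
# Sketch: b_WLS = (X' V^{-1} X)^{-1} X' V^{-1} y, and A X beta = beta
# shows unbiasedness (V, beta, and X here are illustrative assumptions)
import numpy as np

rng = np.random.default_rng(2)
n, p = 50, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])
v = rng.uniform(0.5, 3.0, size=n)   # diagonal of V (diagonal, != I)
Vinv = np.diag(1.0 / v)

A = np.linalg.inv(X.T @ Vinv @ X) @ X.T @ Vinv   # b_WLS = A @ y

# E(b_WLS) = A X beta, which equals beta exactly for any beta
print(np.allclose(A @ X @ beta, beta))
```

Since AX = (X'V⁻¹X)⁻¹X'V⁻¹X = I, the expectation of β̂_WLS equals β regardless of the true β, which is exactly part (b).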
In the context of multiple regression, define the n × n matrix M = I − X(X'X)^(-1)X'.
(i) Show that M is symmetric and idempotent.
(ii) Prove that the diagonal elements m_tt of M satisfy 0 ≤ m_tt ≤ 1 for t = 1, 2, ..., n.
(iii) Suppose the linear model y = Xβ + u satisfies the Gauss-Markov assumptions. Let û be the vector of OLS residuals. Show that E(ûû' | X) = σ²M.
(iv) Conclude that while the errors {u_t} are homoskedastic and uncorrelated, the OLS residuals {û_t} are in general heteroskedastic and correlated.
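Parts (i) and (ii) can be spot-checked numerically. The simulated design matrix is an illustrative assumption:

```python
# Numeric check that M = I - X(X'X)^{-1}X' is symmetric and idempotent,
# and that its diagonal entries lie in [0, 1] (simulated X is an assumption)
import numpy as np

rng = np.random.default_rng(3)
n, k = 25, 4
X = rng.normal(size=(n, k))

M = np.eye(n) - X @ np.linalg.inv(X.T @ X) @ X.T
m = np.diag(M)

print(np.allclose(M, M.T))        # symmetry
print(np.allclose(M @ M, M))      # idempotency
print(bool((m >= 0).all() and (m <= 1).all()))
```

Idempotency plus symmetry gives m_tt = Σ_s M_ts², which is the key step in proving (ii) analytically.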
2. Consider a simple linear regression model for a response variable Y_i and a single predictor variable x_i, i = 1, ..., n, with Gaussian (i.e. normally distributed) errors:

    Y_i = β x_i + ε_i,   ε_i iid N(0, σ²).

This model is often called "regression through the origin" since E(Y_i) = 0 if x_i = 0.
(a) Write down the likelihood function for the parameters β and σ².
(b) Find the MLEs for β and σ², explicitly showing that they are unique maximizers of the likelihood function. Hint: the function g(x) = log(x) + 1 − x ...
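For part (b), the closed-form MLEs are β̂ = Σx_iY_i / Σx_i² and σ̂² = RSS/n. The sketch below checks that the log-likelihood is lower at perturbed parameter values, consistent with a unique maximum; the simulated data are an illustrative assumption:

```python
# Closed-form MLEs for regression through the origin and a perturbation
# check of maximality (simulated x, y are illustrative assumptions)
import numpy as np

rng = np.random.default_rng(4)
n = 40
x = rng.uniform(1, 5, size=n)
y = 2.0 * x + rng.normal(size=n)

beta_hat = (x @ y) / (x @ x)                 # MLE of beta
sigma2_hat = np.mean((y - beta_hat * x)**2)  # MLE of sigma^2 (RSS/n)

def loglik(b, s2):
    r = y - b * x
    return -0.5 * n * np.log(2 * np.pi * s2) - (r @ r) / (2 * s2)

L0 = loglik(beta_hat, sigma2_hat)
# Any perturbation away from the MLEs should strictly lower the log-likelihood
perturbs = [(0.1, 0.0), (-0.1, 0.0),
            (0.0, 0.5 * sigma2_hat), (0.0, -0.5 * sigma2_hat)]
ok = all(loglik(beta_hat + db, sigma2_hat + ds) < L0 for db, ds in perturbs)
print(ok)
```

This numeric check complements, but does not replace, the analytic argument via the hint's function g.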
Econometrics 13) Consider the classical linear regression model y = Xβ + ε, ε ~ N(0, σ²I). The data are collected in such a way that the X matrix is orthogonal, that is, X'X = I. We want to test the null hypothesis H₀: β₁ + β₂ + ... + β_k = 0. For this particular hypothesis, the standard t-test for a single linear restriction r'β = q reduces to
a) t = ...
b) t = (Σ_{i=1}^k b_i) / (s√k)
c) t = ...
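The reduction can be seen numerically: with r = (1, ..., 1)' and X'X = I, we have r'(X'X)⁻¹r = k, so the general t statistic collapses to Σb_i / (s√k). The orthogonal design below (built via a QR decomposition) and the response are illustrative assumptions:

```python
# Sketch: with X'X = I, the t statistic for H0: b1+...+bk = 0
# reduces to t = (sum of b_i) / (s * sqrt(k))
# (the QR-based design and simulated y are illustrative assumptions)
import numpy as np

rng = np.random.default_rng(5)
n, k = 30, 4
Q, _ = np.linalg.qr(rng.normal(size=(n, k)))   # Q'Q = I_k
X = Q
y = rng.normal(size=n)

b = X.T @ y                       # OLS, since (X'X)^{-1} = I
resid = y - X @ b
s2 = resid @ resid / (n - k)
s = np.sqrt(s2)

r = np.ones(k)
# General t: r'b / sqrt(s^2 * r'(X'X)^{-1} r); here r'(X'X)^{-1}r = k
t_general = (r @ b) / np.sqrt(s2 * r @ np.linalg.inv(X.T @ X) @ r)
t_reduced = b.sum() / (s * np.sqrt(k))
print(np.isclose(t_general, t_reduced))
```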
Consider the following simple regression model: y_t = β₁ + β₂x_t + e_t, where the e_t are independent errors with E(e_t) = 0 and var(e_t) = σ²x_t².
a. In this case, would an ordinary least squares regression provide you with the best linear unbiased estimates? Why or why not?
b. What is the transformed model that would give you constant error variance?
c. Given the following data: y = (4, 3, 1, 0, 2) and x = (1, 2, 1, 3, 4), find the generalized least squares estimates of β₁ and β₂. (Do this by hand! Not with Excel.)
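A sketch of the computation in part (c), which you can use to verify the by-hand answer. It assumes the model y_t = β₁ + β₂x_t + e_t with var(e_t) = σ²x_t², so dividing through by x_t gives the constant-variance transformed model y_t/x_t = β₁(1/x_t) + β₂ + e_t/x_t, and GLS is OLS on the transformed data:

```python
# GLS for part (c) via the transformed model
# y_t/x_t = beta1*(1/x_t) + beta2 + e_t/x_t
# (the model form y = beta1 + beta2*x + e is an assumption from context)
import numpy as np

y = np.array([4.0, 3.0, 1.0, 0.0, 2.0])
x = np.array([1.0, 2.0, 1.0, 3.0, 4.0])

Xs = np.column_stack([1.0 / x, np.ones_like(x)])  # columns: beta1, beta2
ys = y / x
b1, b2 = np.linalg.lstsq(Xs, ys, rcond=None)[0]
print(b1, b2)   # GLS estimates of beta1 and beta2
```

Equivalently, this is weighted least squares on the original data with weights 1/x_t².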
4. Consider the regression model y_i = β₁ + β₂x_i2 + ... + β_K x_iK + e_i, where the errors may be heteroskedastic. Choose the most incorrect statement.
(a) The OLS estimators are consistent and unbiased.
(b) We should report the OLS estimates with the robust standard errors.
(c) The Gauss-Markov theorem may not apply.
(d) The GLS cannot be used because we do not know the error variances in practice.
(e) We should take care of heteroskedasticity only if homoskedasticity is rejected.

Consider the regression model y_t = β₁ + ... + β_K x_tK + e_t, with e_t = ρe_{t−1} + ...
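To make option (b) concrete, here is a minimal sketch of OLS with White (HC0) robust standard errors; the simulated heteroskedastic data are an illustrative assumption:

```python
# OLS with White (HC0) robust standard errors:
# V_robust = (X'X)^{-1} X' diag(u_i^2) X (X'X)^{-1}
# (the simulated heteroskedastic data are an illustrative assumption)
import numpy as np

rng = np.random.default_rng(6)
n = 200
x = rng.uniform(1, 5, size=n)
e = rng.normal(scale=x)            # error variance grows with x
y = 1.0 + 2.0 * x + e

X = np.column_stack([np.ones(n), x])
XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y
u = y - X @ b

meat = X.T @ (X * u[:, None]**2)   # X' diag(u^2) X
V_robust = XtX_inv @ meat @ XtX_inv
se_robust = np.sqrt(np.diag(V_robust))
print(se_robust)
```

OLS remains unbiased here, but the usual (non-robust) standard errors would be invalid, which is why the sandwich form is reported instead.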
4. Consider the linear model Y = Xβ + ε, where ε ~ MVN(0, σ²I).
(1) Derive the formula for β̂, the least squares estimate of β, using matrix notation.
(2) Show that β̂ is an unbiased estimate of β.
(3) Derive the formula for var(β̂), using matrix notation.
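The three parts have the standard answers β̂ = (X'X)⁻¹X'Y, E(β̂) = β, and var(β̂) = σ²(X'X)⁻¹, which can be checked algebraically in code. The simulated design, β, and σ below are illustrative assumptions:

```python
# Check of the OLS formulas: b = (X'X)^{-1}X'y = A y, with
# A X beta = beta (unbiasedness) and A A' = (X'X)^{-1} so that
# var(b) = sigma^2 (X'X)^{-1}  (simulated X, beta, sigma are assumptions)
import numpy as np

rng = np.random.default_rng(7)
n, p, sigma = 100, 3, 1.5
X = rng.normal(size=(n, p))
beta = np.array([0.5, -1.0, 2.0])

XtX_inv = np.linalg.inv(X.T @ X)
A = XtX_inv @ X.T                  # b = A y is linear in y

# (2) Unbiasedness: E(b) = A X beta = beta for any beta
print(np.allclose(A @ X @ beta, beta))

# (3) var(b) = A var(e) A' = sigma^2 A A' = sigma^2 (X'X)^{-1}
V = sigma**2 * A @ A.T
print(np.allclose(V, sigma**2 * XtX_inv))
```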