c)
First, I assume I have a dataset with a dependent variable Yi and independent variables X1i, X2i, X3i, ..., Xki.
Then I fit a linear regression model to that dataset: Y = a + bX1 + Z + e, where Z is a linear combination of all the independent variables from X2 onwards: Z = cX2 + dX3 + ... Z therefore involves neither a nor b.
The model is fitted, i.e. the parameters a, b, c, d, ... are determined, by minimizing the sum of squared errors s(a,b,c,d,...) = Ʃ ei^2 = Ʃ (Yi - a - bX1i - Zi)^2.
To do this, I take the partial derivatives of s with respect to each of a, b, c, d, ... and set them equal to 0.
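Setting these partial derivatives to zero gives, in matrix form, the normal equations (X'X)β = X'Y. As a sanity check, here is a minimal sketch in Python/NumPy on synthetic data (the data and all variable names are my own illustration, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 3

# Synthetic predictors X1..Xk and a response with known coefficients plus noise
X = rng.normal(size=(n, k))
Y = 1.5 + X @ np.array([2.0, -1.0, 0.5]) + rng.normal(size=n)

# Design matrix with an intercept column, so beta stacks (a, b, c, d)
Xd = np.column_stack([np.ones(n), X])

# Normal equations (X'X) beta = X'Y, i.e. the stationarity conditions ∂s/∂beta = 0
beta = np.linalg.solve(Xd.T @ Xd, Xd.T @ Y)

# The gradient of s(beta) = Ʃ ei^2 is -2 X'(Y - X beta); it vanishes at the solution
e = Y - Xd @ beta
print(np.allclose(Xd.T @ e, 0))  # True, up to floating-point error
```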
I find that ∂s/∂a = -2 Ʃ(Yi - a - bX1i - Zi) = 0. Therefore Ʃ(Yi - a - bX1i - Zi) = Ʃei = 0, and so the mean residual e~ = (1/n)Ʃei = 0.
Similarly, ∂s/∂b = -2 Ʃ X1i(Yi - a - bX1i - Zi) = 0. Therefore Ʃ X1i(Yi - a - bX1i - Zi) = Ʃ X1i ei = 0.
Then Ʃ (ei - e~)(X1i - X1~) = Ʃ ei X1i - X1~ Ʃ ei - e~ Ʃ X1i + n e~ X1~ = 0 - X1~·0 - 0 + 0 = 0. Therefore the sample covariance Cov(e, X1) = 0, which is what I wanted to prove.
X1 can be replaced with any of the other predictors collected in Z, and the analysis above repeated. Because the regression function treats all the predictor variables symmetrically, I then find that Cov(e, Xk) = 0 for any k.
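These are exactly the two properties e~ = 0 and Cov(e, Xk) = 0. Continuing the NumPy sketch from above (same illustrative data), the claim can be verified numerically for every predictor column:

```python
# First normal equation (intercept column): the residuals average to zero
print(np.isclose(e.mean(), 0))  # True

# Sample covariance of the residuals with each predictor column is zero
for j in range(k):
    cov_ej = np.mean((e - e.mean()) * (X[:, j] - X[:, j].mean()))
    print(np.isclose(cov_ej, 0))  # True for every column
```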