Question

II. Derivations (You must show all your work for full credit.)
i. Given the model y = X\beta + \varepsilon, derive the least squares estimate of \beta. (10 points)
ii. Show that \hat{\beta} = (X'X)^{-1}X'y is an unbiased estimator of \beta.

Answer #1

Let

X be an n x k matrix of full column rank,

Y and \varepsilon be n x 1 vectors, and \beta be a k x 1 vector,

X' be the transpose of X, and

\hat{\beta} be the estimate of \beta.

Now, stacking the n scalar equations y_i = x_i'\beta + \varepsilon_i, the model in matrix form is

Y = X\beta + \varepsilon

i. By the principle of ordinary least squares we minimise the sum of squared residuals.

Residuals: e = Y - X\hat{\beta}

Sum of squared residuals:

e'e = (Y - X\hat{\beta})'(Y - X\hat{\beta}) = Y'Y - \hat{\beta}'X'Y - Y'X\hat{\beta} + \hat{\beta}'X'X\hat{\beta} = Y'Y - 2\hat{\beta}'X'Y + \hat{\beta}'X'X\hat{\beta},

where the two middle terms combine because each is a scalar and one is the transpose of the other.
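As a quick numerical sanity check of this expansion (not part of the derivation itself), the NumPy sketch below evaluates both sides for arbitrary, randomly generated X, Y and a candidate coefficient vector; all names and dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = rng.normal(size=(n, k))   # illustrative design matrix
Y = rng.normal(size=(n, 1))   # illustrative response vector
b = rng.normal(size=(k, 1))   # an arbitrary candidate coefficient vector

# Left-hand side: e'e = (Y - Xb)'(Y - Xb)
lhs = ((Y - X @ b).T @ (Y - X @ b)).item()

# Right-hand side: Y'Y - 2 b'X'Y + b'X'Xb
rhs = (Y.T @ Y - 2.0 * b.T @ X.T @ Y + b.T @ X.T @ X @ b).item()

assert np.isclose(lhs, rhs)   # the expansion holds for any b
```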

To minimise, take the partial derivative with respect to \hat{\beta} and set it equal to zero:

\partial(e'e)/\partial\hat{\beta} = -2X'Y + 2X'X\hat{\beta} = 0

Now we get the normal equations

(X'X)\hat{\beta} = X'Y

and, since X has full column rank so that X'X is invertible,

\hat{\beta} = (X'X)^{-1}X'Y   ---(1)
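As an illustration (not part of the original solution), the following NumPy sketch generates a small synthetic data set, solves the normal equations for \hat{\beta}, and cross-checks the result against NumPy's built-in least-squares routine; the dimensions, coefficients and noise level are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 100, 4
X = rng.normal(size=(n, k))              # assumed to have full column rank
beta = np.array([1.0, -2.0, 0.5, 3.0])   # illustrative "true" coefficients
Y = X @ beta + rng.normal(scale=0.3, size=n)

# Normal equations (X'X) beta_hat = X'Y, solved without forming the inverse explicitly.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# Cross-check against NumPy's least-squares solver.
beta_lstsq, *_ = np.linalg.lstsq(X, Y, rcond=None)
assert np.allclose(beta_hat, beta_lstsq)
print(beta_hat)   # close to the true coefficients when the noise is small
```

Solving the normal equations with np.linalg.solve (or calling np.linalg.lstsq directly) is numerically preferable to forming (X'X)^{-1} explicitly, although the algebraic result is the same.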

ii.

Taking expectations on both sides of equation (1), treating X as fixed and substituting Y = X\beta + \varepsilon:

E(\hat{\beta}) = E((X'X)^{-1}X'Y) = (X'X)^{-1}X'E(X\beta + \varepsilon) = (X'X)^{-1}X'X\beta + (X'X)^{-1}X'E(\varepsilon) = \beta, since E(\varepsilon) = 0.

Hence E(\hat{\beta}) = \beta, so \hat{\beta} is an unbiased estimator of \beta.
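A small Monte Carlo sketch can illustrate this unbiasedness claim (an illustration only, with arbitrarily chosen dimensions, coefficients and error variance): over many repeated samples drawn from the same fixed design, the average of \hat{\beta} should be close to the true \beta.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k, reps = 200, 3, 5000
X = rng.normal(size=(n, k))                  # fixed design, as assumed in the derivation
beta = np.array([2.0, -1.0, 0.5])            # illustrative true coefficients
sigma = 0.7                                  # illustrative error standard deviation

XtX_inv_Xt = np.linalg.solve(X.T @ X, X.T)   # (X'X)^{-1} X', reused across replications
estimates = np.empty((reps, k))
for r in range(reps):
    eps = rng.normal(scale=sigma, size=n)    # E(eps) = 0, Var(eps) = sigma^2 I
    Y = X @ beta + eps
    estimates[r] = XtX_inv_Xt @ Y

print(estimates.mean(axis=0))   # Monte Carlo mean of beta-hat, approximately [2.0, -1.0, 0.5]
```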

iii. From equation (1) and Y = X\beta + \varepsilon we have \hat{\beta} - \beta = (X'X)^{-1}X'\varepsilon, so

Var(\hat{\beta}) = E[(\hat{\beta} - \beta)(\hat{\beta} - \beta)']
= E[(X'X)^{-1}X'\varepsilon\varepsilon'X(X'X)^{-1}]
= (X'X)^{-1}X'E(\varepsilon\varepsilon')X(X'X)^{-1}
= (X'X)^{-1}X'(\sigma^2 I)X(X'X)^{-1}
= \sigma^2(X'X)^{-1}X'X(X'X)^{-1}
= \sigma^2(X'X)^{-1},

which is the desired variance-covariance matrix (using E(\varepsilon\varepsilon') = \sigma^2 I).
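As with unbiasedness, this formula can be illustrated by simulation (an illustration only, with arbitrary example values): the empirical covariance matrix of \hat{\beta} across many replications should be close to \sigma^2(X'X)^{-1}.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k, reps = 200, 3, 20000
X = rng.normal(size=(n, k))                    # fixed design matrix
beta = np.array([2.0, -1.0, 0.5])              # illustrative true coefficients
sigma = 0.7                                    # illustrative error standard deviation

XtX_inv = np.linalg.inv(X.T @ X)
eps = rng.normal(scale=sigma, size=(reps, n))  # reps independent error vectors
Y = X @ beta + eps                             # one simulated response vector per row
estimates = Y @ X @ XtX_inv                    # row r is beta_hat = (X'X)^{-1} X' Y_r

empirical_cov = np.cov(estimates, rowvar=False)
theoretical_cov = sigma**2 * XtX_inv
print(np.max(np.abs(empirical_cov - theoretical_cov)))   # small relative to the entries
```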
