Need help with #5, please. Thank you.

For the following problems assume $y = X\beta + \varepsilon$, and assume $E(\varepsilon) = 0$ and $\operatorname{var}(\varepsilon) = \sigma^2 I$.

3. Show that $\hat{\varepsilon} = (I - X(X^TX)^{-1}X^T)\varepsilon$. (Hint: use the assumption that $y = X\beta + \varepsilon$.)

5. Use the identity in (3) and the assumption that $\operatorname{var}(\varepsilon) = \sigma^2 I$ to show that ...
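The identity in (3) can be verified numerically before proving it; a minimal NumPy sketch with arbitrary simulated data (the sample size and design matrix here are illustrative assumptions, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
# Design matrix with an intercept column; values are arbitrary simulated data.
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
beta = rng.normal(size=p + 1)
eps = rng.normal(size=n)
y = X @ beta + eps

H = X @ np.linalg.inv(X.T @ X) @ X.T           # hat matrix X(X^T X)^{-1} X^T
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)   # OLS estimate
resid = y - X @ beta_hat                       # residual vector, eps_hat

# eps_hat = (I - H) eps, because (I - H) X beta = 0.
print(np.allclose(resid, (np.eye(n) - H) @ eps))  # prints True
```

The key step in the proof mirrors the comment: substituting $y = X\beta + \varepsilon$ into $\hat\varepsilon = (I-H)y$ kills the $X\beta$ term because $HX = X$.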
2. The vector of predicted values is defined as $\hat{y} = X\hat{\beta}$. Show $\operatorname{var}(\hat{y}) = \sigma^2 X(X^TX)^{-1}X^T$.
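For problem 2, the derivation follows from writing $\hat{y}$ as a linear map of $y$; a sketch:

```latex
\begin{aligned}
\hat{y} &= X\hat{\beta} = X(X^TX)^{-1}X^T y = Hy,
  \qquad H := X(X^TX)^{-1}X^T,\\
\operatorname{var}(\hat{y}) &= H\,\operatorname{var}(y)\,H^T
  = \sigma^2 H H^T = \sigma^2 H = \sigma^2 X(X^TX)^{-1}X^T,
\end{aligned}
```

using $\operatorname{var}(y) = \operatorname{var}(\varepsilon) = \sigma^2 I$ and the facts that $H$ is symmetric ($H^T = H$) and idempotent ($H^2 = H$).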
1. We are often interested in estimating $y_* = x_*^T\beta$, where $x_*$ is a $(p+1) \times 1$ vector of predictors. We will find that a reasonable estimator is $\hat{y}_* = x_*^T\hat{\beta}$. Find $\operatorname{var}(\hat{y}_*)$.
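Assuming the estimator in problem 1 is $\hat{y}_* = x_*^T\hat{\beta}$ with $\hat{\beta}$ the OLS estimator (the formulas are partly lost in the source), the variance follows in one line from $\operatorname{var}(\hat{\beta}) = \sigma^2(X^TX)^{-1}$:

```latex
\operatorname{var}(\hat{y}_*) = \operatorname{var}(x_*^T\hat{\beta})
 = x_*^T\,\operatorname{var}(\hat{\beta})\,x_*
 = \sigma^2\, x_*^T (X^TX)^{-1} x_* .
```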
Let $y = X\beta + \varepsilon$ where $\varepsilon \sim N(0, \sigma^2 I)$. Let $\hat{\beta} = (X^TX)^{-1}X^Ty$ and let $\hat{e} = y - X\hat{\beta}$.
(a) Show that $\hat{e} = (I - P_X)\varepsilon$, where $P_X = X(X^TX)^{-1}X^T$.
(b) Compute $E\|e - \hat{e}\|^2$.
(c) Compute $\operatorname{var}(e - \hat{e})$.
e. Consider the multiple regression model $y = X\beta + \varepsilon$ with $E(\varepsilon) = 0$ and $\operatorname{var}(\varepsilon) = \sigma^2 I$. Assume that $\varepsilon \sim N(0, \sigma^2 I)$. When we test the hypothesis $H_0: \beta_i = 0$ against $H_a: \beta_i \neq 0$, we use the $t$ statistic with $n - k - 1$ degrees of freedom. When $H_0$ is not true, find the expected value and variance of the test statistic.
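A hint for this problem: under $H_a$ the numerator of the $t$ statistic no longer has mean zero, so the statistic follows a noncentral $t$ distribution with $n-k-1$ degrees of freedom and noncentrality parameter

```latex
\lambda = \frac{\beta_i}{\sigma\sqrt{\left[(X^TX)^{-1}\right]_{ii}}},
```

and the required mean and variance are those of a noncentral $t$ random variable with these parameters.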
1. Given the multiple linear regression model $Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \beta_3 X_3 + \cdots + \beta_p X_p + \varepsilon$, which in matrix notation is written as $y = X\beta + \varepsilon$, where $\varepsilon$ has a $N(0, \sigma^2 I)$ distribution.
A. Show that the OLS estimator of the parameter vector $\beta$ is given by $\hat{\beta} = (X^TX)^{-1}X^TY$.
B. Show that the OLS estimator in A above is an unbiased estimator of $\beta$. (Hint: show $E(\hat{\beta}) = \beta$.)
C. Show that the variance of the estimator is $\operatorname{var}(\hat{\beta}) = \sigma^2(X^TX)^{-1}$.
D. What is the distribution of the ...
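Parts B and C both follow from substituting the model $y = X\beta + \varepsilon$ into the estimator; a sketch of the shared first step:

```latex
\begin{aligned}
\hat{\beta} &= (X^TX)^{-1}X^T(X\beta + \varepsilon)
  = \beta + (X^TX)^{-1}X^T\varepsilon,\\
E(\hat{\beta}) &= \beta + (X^TX)^{-1}X^T E(\varepsilon) = \beta,\\
\operatorname{var}(\hat{\beta}) &= (X^TX)^{-1}X^T\,(\sigma^2 I)\,X(X^TX)^{-1}
  = \sigma^2 (X^TX)^{-1}.
\end{aligned}
```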
2. The linear regression model in matrix format is $Y = X\beta + e$, with the usual definitions. Let $E(e \mid X) = 0$ and

$$\operatorname{var}(e \mid X) = \Sigma = \begin{pmatrix} \gamma_1 & 0 & \cdots & 0 \\ 0 & \gamma_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \gamma_N \end{pmatrix}.$$

Notice that as a covariance matrix, $\Sigma$ is symmetric and nonnegative definite.
(i) Derive $\operatorname{var}(\hat{\beta}_{OLS} \mid X)$.
(ii) Let $\tilde{\beta} = CY$ be any other linear unbiased estimator, where $C$ is an $N \times K$ function of $X$. Prove $\operatorname{var}(\tilde{\beta} \mid X) \succeq (X^T\Sigma^{-1}X)^{-1}$.
3. An oracle ...
Exercise 2.6: Consider the models $y = X\beta + e$ and $y^* = X^*\beta + e^*$ where $E(e) = 0$, $\operatorname{cov}(e) = \sigma^2 I$, $y^* = \Gamma y$, $X^* = \Gamma X$, $e^* = \Gamma e$, and $\Gamma$ is a known $n \times n$ orthogonal matrix. Show that:
1. $E(e^*) = 0$, $\operatorname{cov}(e^*) = \sigma^2 I$.
2. $b = b^*$ and $s^2 = s^{*2}$, where $b$ and $b^*$ are the least squares estimates of $\beta$, and $s^2$ and $s^{*2}$ are the estimates of $\sigma^2$ obtained from the two models.
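Exercise 2.6 part 2 can be checked numerically; a minimal NumPy sketch on arbitrary simulated data, with a random orthogonal $\Gamma$ built from a QR decomposition (all values here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 20, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)

# A random n x n orthogonal matrix: the Q factor of a QR decomposition.
Gamma, _ = np.linalg.qr(rng.normal(size=(n, n)))

def ols(Xm, ym):
    """Least squares estimate b and the usual estimate s^2 of sigma^2."""
    b = np.linalg.solve(Xm.T @ Xm, Xm.T @ ym)
    resid = ym - Xm @ b
    s2 = resid @ resid / (len(ym) - Xm.shape[1])
    return b, s2

b, s2 = ols(X, y)
b_star, s2_star = ols(Gamma @ X, Gamma @ y)

# Orthogonal Gamma leaves both estimates unchanged, since
# (Gamma X)^T (Gamma X) = X^T X and (Gamma X)^T (Gamma y) = X^T y.
print(np.allclose(b, b_star), np.allclose(s2, s2_star))  # prints True True
```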
2. In the regression model $Y = X\beta + \varepsilon$, $X$ is a fixed $n \times k$ matrix of rank $k$, $E(\varepsilon) = 0$ and $E(\varepsilon\varepsilon') = \sigma^2\Omega$, where $\Omega$ is a known non-singular matrix. The GLS estimator of $\beta$ is given by the formula $\hat{\beta}_{GLS} = (X'\Omega^{-1}X)^{-1}X'\Omega^{-1}Y$.
Consider the following data: 16 31 2 3 51 4 10. Assuming that ...
a) Calculate the GLS estimate of $\beta$ in the model $Y = X\beta + \varepsilon$.
b) Calculate the OLS estimate.
c) Compare the two estimates and comment on efficiency.
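Since the data block in the source is garbled, here is a minimal NumPy sketch of the GLS-versus-OLS comparison on simulated data with a diagonal $\Omega$ (all numbers are illustrative assumptions, not the problem's data):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])
Omega = np.diag(rng.uniform(0.5, 4.0, size=n))   # known error covariance (up to sigma^2)
eps = rng.multivariate_normal(np.zeros(n), Omega)
y = X @ np.array([1.0, 2.0]) + eps

Oinv = np.linalg.inv(Omega)
beta_gls = np.linalg.solve(X.T @ Oinv @ X, X.T @ Oinv @ y)   # GLS estimate
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)                 # OLS estimate

# Efficiency: var(GLS) = (X' Omega^{-1} X)^{-1} is no larger, in the
# positive-semidefinite order, than
# var(OLS) = (X'X)^{-1} X' Omega X (X'X)^{-1}  (Aitken's theorem).
V_gls = np.linalg.inv(X.T @ Oinv @ X)
XtX_inv = np.linalg.inv(X.T @ X)
V_ols = XtX_inv @ X.T @ Omega @ X @ XtX_inv
print(np.linalg.eigvalsh(V_ols - V_gls).min() >= -1e-10)  # prints True
```

The final check is the substance of part c): the difference of the two covariance matrices is positive semidefinite, so GLS is at least as efficient as OLS whenever $\Omega \neq I$.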
Let $X$ and $Y$ be independent identically distributed random variables with means $\mu_X$ and $\mu_Y$ respectively. Prove the following.
a. $E[aX + bY] = a\mu_X + b\mu_Y$ for any constants $a$ and $b$.
b. $\operatorname{Var}[X] = E[X^2] - E[X]^2$
c. $\operatorname{Var}[aX] = a^2\operatorname{Var}[X]$ for any constant $a$.
d. Assume for this part only that $X$ and $Y$ are not independent. Then $\operatorname{Var}[X + Y] = \operatorname{Var}[X] + \operatorname{Var}[Y] + 2(E[XY] - E[X]E[Y])$.
e. ...
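Part d expands the square directly and then applies the identity from part b; a sketch of the key steps:

```latex
\begin{aligned}
\operatorname{Var}[X+Y] &= E[(X+Y)^2] - (E[X+Y])^2\\
&= E[X^2] + 2E[XY] + E[Y^2] - (E[X])^2 - 2E[X]E[Y] - (E[Y])^2\\
&= \operatorname{Var}[X] + \operatorname{Var}[Y] + 2\left(E[XY] - E[X]E[Y]\right).
\end{aligned}
```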