Let y = Xβ + ε where ε ~ N(0, σ²I). Let β̂ = (XᵀX)⁻¹Xᵀy and let ê = y − Xβ̂. (a) Show that ê = (I − P_X)ε where …
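A quick numerical sketch (not a proof) of the identity in (a), taking P_X = X(XᵀX)⁻¹Xᵀ as the usual projection onto the column space of X; the toy data below are made up for illustration.

```python
import numpy as np

# With P = X(X'X)^{-1}X', the residual e_hat = y - X beta_hat equals
# (I - P) eps exactly, because P X beta = X beta.
rng = np.random.default_rng(0)
n, p = 8, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])
eps = rng.normal(size=n)
y = X @ beta + eps

P = X @ np.linalg.inv(X.T @ X) @ X.T          # projection onto col(X)
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y   # OLS estimate
e_hat = y - X @ beta_hat                      # residual vector

assert np.allclose(e_hat, (np.eye(n) - P) @ eps)
```

The key algebraic step the check relies on is (I − P_X)Xβ = 0, so only the (I − P_X)ε term survives.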
For the following problems assume y = Xβ + ε, E(ε) = 0, and var(ε) = σ²I. 2. The vector of predicted values is defined as ŷ = Xβ̂. Show that var(ŷ) = σ²X(XᵀX)⁻¹Xᵀ.
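The result follows from writing ŷ = Hy with H = X(XᵀX)⁻¹Xᵀ, so that var(ŷ) = H(σ²I)Hᵀ = σ²HHᵀ = σ²H. A numerical check of the two properties of H that make the last step work (the matrix below is arbitrary):

```python
import numpy as np

# var(y_hat) = H (sigma^2 I) H' = sigma^2 H because H is symmetric
# and idempotent; verify those two properties for a random X.
rng = np.random.default_rng(1)
X = rng.normal(size=(10, 4))
H = X @ np.linalg.inv(X.T @ X) @ X.T

assert np.allclose(H, H.T)      # symmetric
assert np.allclose(H @ H, H)    # idempotent, so H H' = H
```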
Exercise 2.6: Consider the models y = Xβ + ε and y* = X*β + ε*, where E(ε) = 0, cov(ε) = σ²I, y* = Γy, X* = ΓX, ε* = Γε, and Γ is a known n × n orthogonal matrix. Show that: 1. E(ε*) = 0 and cov(ε*) = σ²I. 2. b* = b and s*² = s², where b and b* are the least-squares estimates of β, and s² and s*² are the estimates of σ², obtained from the two models.
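The invariance in part 2 comes from ΓᵀΓ = I, which gives X*ᵀX* = XᵀX and X*ᵀy* = Xᵀy, so the normal equations (and the residual sum of squares) are unchanged. A numerical sketch with an arbitrary orthogonal Γ built by QR:

```python
import numpy as np

# Rotating (y, X) by an orthogonal Gamma leaves b and s^2 unchanged,
# since X*'X* = X'X and X*'y* = X'y.
rng = np.random.default_rng(2)
n, p = 9, 3
X = rng.normal(size=(n, p))
y = rng.normal(size=n)
Gamma, _ = np.linalg.qr(rng.normal(size=(n, n)))   # random orthogonal matrix

def ls_fit(X, y):
    b = np.linalg.solve(X.T @ X, X.T @ y)
    resid = y - X @ b
    s2 = resid @ resid / (len(y) - X.shape[1])     # usual unbiased estimate of sigma^2
    return b, s2

b, s2 = ls_fit(X, y)
b_star, s2_star = ls_fit(Gamma @ X, Gamma @ y)
assert np.allclose(b, b_star) and np.isclose(s2, s2_star)
```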
For the following problems assume y = Xβ + ε, E(ε) = 0, and var(ε) = σ²I. 1. We are often interested in estimating E(y*) = x*ᵀβ, where x* is a (p+1) × 1 vector of predictors. We will find that a reasonable estimator is ŷ* = x*ᵀβ̂. Find var(ŷ*).
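Since ŷ* = x*ᵀβ̂ = aᵀy with a = X(XᵀX)⁻¹x*, we get var(ŷ*) = σ²aᵀa = σ²x*ᵀ(XᵀX)⁻¹x*. The middle identity aᵀa = x*ᵀ(XᵀX)⁻¹x* can be checked numerically (arbitrary X and x* below):

```python
import numpy as np

# a = X(X'X)^{-1} x*, so a'a = x*'(X'X)^{-1}X'X(X'X)^{-1}x* = x*'(X'X)^{-1}x*,
# giving var(y*_hat) = sigma^2 x*'(X'X)^{-1} x*.
rng = np.random.default_rng(3)
X = rng.normal(size=(12, 4))
x_star = rng.normal(size=4)
XtX_inv = np.linalg.inv(X.T @ X)
a = X @ XtX_inv @ x_star
assert np.isclose(a @ a, x_star @ XtX_inv @ x_star)
```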
1. Given the multiple linear regression model Y = β₀ + β₁X₁ + β₂X₂ + β₃X₃ + ⋯ + β_pX_p + ε, which in matrix notation is written as y = Xβ + ε, where ε has a N(0, σ²I) distribution. A. Show that the OLS estimator of the parameter vector β is given by β̂ = (XᵀX)⁻¹Xᵀy. B. Show that the OLS estimator in A above is an unbiased estimator of β. Hint: E(β̂) = β. C. Show that the variance of the estimator is Var(β̂) = σ²(XᵀX)⁻¹. D. What is the distribution of the …
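Parts A–C all flow from one substitution: plugging y = Xβ + ε into β̂ = (XᵀX)⁻¹Xᵀy gives β̂ = β + (XᵀX)⁻¹Xᵀε, so E(β̂) = β and Var(β̂) = σ²(XᵀX)⁻¹. The decomposition holds exactly for any draw of ε, as this sketch with made-up data shows:

```python
import numpy as np

# beta_hat = (X'X)^{-1}X'y decomposes exactly as beta + (X'X)^{-1}X' eps
# when y = X beta + eps, which is the heart of the unbiasedness argument.
rng = np.random.default_rng(4)
n, p = 15, 4
X = rng.normal(size=(n, p))
beta = np.arange(1.0, p + 1)
eps = rng.normal(size=n)
y = X @ beta + eps

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
assert np.allclose(beta_hat, beta + XtX_inv @ X.T @ eps)
```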
Need help with #5, please. Thank you. For the following problems assume y = Xβ + ε, E(ε) = 0, and var(ε) = σ²I. 3. Show that ê = (I − X(XᵀX)⁻¹Xᵀ)ε. (Hint: use the assumption that y = Xβ + ε.) 5. Use the identity in (3) and the assumption that var(ε) = σ²I to show that …
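For #5 the target (truncated in the question) is presumably var(ê) = σ²(I − H) with H = X(XᵀX)⁻¹Xᵀ: from (3), var(ê) = (I − H)(σ²I)(I − H)ᵀ, and symmetry plus idempotency of I − H collapse this to σ²(I − H). A numerical check of that last step:

```python
import numpy as np

# var(e_hat) = (I - H) sigma^2 I (I - H)' = sigma^2 (I - H),
# because M = I - H is symmetric and idempotent.
rng = np.random.default_rng(5)
n = 10
X = rng.normal(size=(n, 3))
H = X @ np.linalg.inv(X.T @ X) @ X.T
M = np.eye(n) - H
sigma2 = 2.5
assert np.allclose(M @ (sigma2 * np.eye(n)) @ M.T, sigma2 * M)
```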
For the following problems assume y = Xβ + ε, E(ε) = 0, and var(ε) = σ²I. 3. Show that ê = (I − X(XᵀX)⁻¹Xᵀ)ε. (Hint: use the assumption that y = Xβ + ε.)
Let β̂ = (X′X)⁻¹X′y where y ∼ N(Xβ, σ²I), X is an n × (k+1) matrix, and β is a (k+1) × 1 vector. Are β̂′A′[A(X′X)⁻¹A′]⁻¹Aβ̂ and y′[I − X(X′X)⁻¹X′]y independent?
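One route: β̂ = (X′X)⁻¹X′y and ê = (I − H)y (with H = X(X′X)⁻¹X′) are jointly normal, and cov(β̂, ê) = σ²(X′X)⁻¹X′(I − H) = 0, so they are independent. The first quadratic form depends on y only through β̂, and y′[I − H]y = ê′ê, so the two forms are independent. A numerical check of the zero-covariance matrix:

```python
import numpy as np

# cov(beta_hat, e_hat) is proportional to (X'X)^{-1}X'(I - H), which is
# identically zero because X'(I - H) = X' - X'H = 0.
rng = np.random.default_rng(6)
n, k1 = 12, 4
X = rng.normal(size=(n, k1))
XtX_inv = np.linalg.inv(X.T @ X)
H = X @ XtX_inv @ X.T
assert np.allclose(XtX_inv @ X.T @ (np.eye(n) - H), 0.0)
```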
Let Y = Xβ + ε be the linear model where X is an n × p matrix with orthonormal columns (the columns of X are orthogonal to each other and each column has length 1). Let β̂ be the least-squares estimate of β, and let β̂ᴿ be the ridge regression estimate with tuning parameter λ. Prove that for each j, … Note: the ridge regression estimate is given by …; the least-squares estimate is given by ….
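The formulas were not transcribed, but with the standard definitions β̂ = (XᵀX)⁻¹Xᵀy and β̂ᴿ = (XᵀX + λI)⁻¹Xᵀy (assumed here), orthonormal columns give XᵀX = I, so β̂ᴿ_j = β̂_j / (1 + λ) for each j: every coordinate is shrunk by the same factor. A numerical sketch:

```python
import numpy as np

# With X'X = I, ridge (X'X + lam I)^{-1}X'y collapses to (1/(1 + lam)) X'y,
# i.e. the least-squares estimate shrunk coordinatewise by 1/(1 + lam).
rng = np.random.default_rng(7)
n, p = 20, 3
X, _ = np.linalg.qr(rng.normal(size=(n, p)))   # orthonormal columns
y = rng.normal(size=n)
lam = 0.7

beta_ls = np.linalg.solve(X.T @ X, X.T @ y)
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
assert np.allclose(beta_ridge, beta_ls / (1 + lam))
```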
e. Consider the multiple regression model y = Xβ + ε with E(ε) = 0 and var(ε) = σ²I. Assume that ε ~ N(0, σ²I). When we test the hypothesis H₀: βⱼ = 0 against Hₐ: βⱼ ≠ 0, we use the t statistic with n − k − 1 degrees of freedom. When H₀ is not true, find the expected value and variance of the test statistic.
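When H₀ is false, the statistic has the form T = (Z + δ)/√(V/ν) with Z ~ N(0,1), V ~ χ²_ν, ν = n − k − 1, and noncentrality δ = βⱼ/(σ√[(XᵀX)⁻¹]ⱼⱼ), i.e. a noncentral t. Its standard moments are E[T] = δ√(ν/2)·Γ((ν−1)/2)/Γ(ν/2) for ν > 1 and Var(T) = ν(1+δ²)/(ν−2) − E[T]² for ν > 2. A Monte Carlo sanity check of the mean formula (δ and ν made up):

```python
import math
import numpy as np

# Simulate T = (Z + delta)/sqrt(V/nu) and compare the sample mean with
# the closed-form noncentral-t mean delta*sqrt(nu/2)*Gamma((nu-1)/2)/Gamma(nu/2).
rng = np.random.default_rng(8)
nu, delta = 10, 1.5
z = rng.normal(size=200_000)
v = rng.chisquare(nu, size=200_000)
t = (z + delta) / np.sqrt(v / nu)

mean_formula = delta * math.sqrt(nu / 2) * math.gamma((nu - 1) / 2) / math.gamma(nu / 2)
assert abs(t.mean() - mean_formula) < 0.05
```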
2. In the regression model Y = Xβ + ε, X is a fixed n × k matrix of rank k ≤ n, E(ε) = 0 and E(εε′) = σ²Ω, where Ω is a known non-singular matrix. The GLS estimator of β is given by the formula β̂_GLS = (X′Ω⁻¹X)⁻¹X′Ω⁻¹Y. Consider the following data: 16 31 2 3 51 4 10. Assuming that … a) Calculate the GLS estimate of β in the model Y = Xβ + ε. b) Calculate the OLS estimate. c) Compare the two estimates and comment on efficiency.
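The GLS estimate can be computed directly from the formula, or equivalently as OLS on whitened data: with Ω = LL′ (Cholesky), regress L⁻¹Y on L⁻¹X. The problem's data did not survive transcription, so the sketch below uses made-up X, Y, and Ω purely to check that the two computations agree:

```python
import numpy as np

# beta_GLS = (X' Omega^{-1} X)^{-1} X' Omega^{-1} y equals OLS on the
# whitened data (L^{-1} X, L^{-1} y) where Omega = L L'.
rng = np.random.default_rng(9)
n = 6
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # illustrative design
y = rng.normal(size=n)
A = rng.normal(size=(n, n))
Omega = A @ A.T + n * np.eye(n)                        # known positive-definite Omega

Om_inv = np.linalg.inv(Omega)
beta_gls = np.linalg.solve(X.T @ Om_inv @ X, X.T @ Om_inv @ y)

L = np.linalg.cholesky(Omega)
Xw, yw = np.linalg.solve(L, X), np.linalg.solve(L, y)  # whitened data
beta_white, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
assert np.allclose(beta_gls, beta_white)
```

By the Gauss–Markov theorem applied to the whitened model, β̂_GLS is the BLUE here, which is the efficiency comparison part c) is after.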