For the following problems assume y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I. 1. We are...
For the following problems assume y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I. 2. The vector of predicted values is defined as ŷ = Xβ̂. Show var(ŷ) = σ²X(XᵀX)⁻¹Xᵀ.
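The identity in problem 2 can be sanity-checked numerically (this is not a proof) by propagating var(y) = σ²I through ŷ = Hy, where H = X(XᵀX)⁻¹Xᵀ is the hat matrix. The dimensions, seed, and σ² below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma2 = 8, 3, 2.5               # illustrative sizes and noise variance
X = rng.standard_normal((n, p))

# Hat matrix H = X (X'X)^{-1} X', so that y_hat = H y.
H = X @ np.linalg.inv(X.T @ X) @ X.T

# var(y_hat) = H var(y) H' = sigma^2 H H' = sigma^2 H,
# because H is symmetric and idempotent (H H = H).
var_yhat = H @ (sigma2 * np.eye(n)) @ H.T

assert np.allclose(H, H.T)             # symmetric
assert np.allclose(H @ H, H)           # idempotent
assert np.allclose(var_yhat, sigma2 * H)
```

The key step the check exposes is that H is a symmetric idempotent projection, which collapses H(σ²I)Hᵀ to σ²H = σ²X(XᵀX)⁻¹Xᵀ.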
For the following problems assume y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I. 3. Show that ε̂ = (I − X(XᵀX)⁻¹Xᵀ)ε. (Hint: Use the assumption that y = Xβ + ε.)
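A quick numerical check of the identity in problem 3: generate data from y = Xβ + ε, compute the least squares residuals, and compare them to (I − H)ε. All names and sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 10, 3
X = rng.standard_normal((n, p))
beta = np.array([1.0, -2.0, 0.5])
eps = rng.standard_normal(n)
y = X @ beta + eps

H = X @ np.linalg.inv(X.T @ X) @ X.T
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat

# The residuals depend only on eps: eps_hat = (I - H) eps,
# because (I - H) X beta = 0 (H projects onto the column space of X).
assert np.allclose(resid, (np.eye(n) - H) @ eps)
```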
Need help with #5, please. Thank you. For the following problems assume y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I. 3. Show that ε̂ = (I − X(XᵀX)⁻¹Xᵀ)ε. (Hint: Use the assumption that y = Xβ + ε.) 5. Use the identity in (3) and the assumption that var(ε) = σ²I to show that,
Let y = Xβ + ε where ε ~ N(0, σ²I). Let β̂ = (XᵀX)⁻¹Xᵀy and let ε̂ = y − Xβ̂. (a) Show that ε̂ = (I − P_X)ε, where P_X = X(XᵀX)⁻¹Xᵀ. (b) Compute E(ε̂ − ε). (c) Compute var(ε̂ − ε).
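Assuming parts (b) and (c) ask for the mean and variance of ε̂ − ε, part (a) gives ε̂ − ε = −P_X ε, so E(ε̂ − ε) = 0 and var(ε̂ − ε) = P_X(σ²I)P_Xᵀ = σ²P_X. A minimal numerical check of that variance algebra (sizes and seed illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, sigma = 12, 4, 1.3
X = rng.standard_normal((n, p))
P = X @ np.linalg.inv(X.T @ X) @ X.T   # P_X

# From (a): eps_hat = (I - P_X) eps, so eps_hat - eps = -P_X eps.
# Hence E(eps_hat - eps) = 0 and
# var(eps_hat - eps) = P_X (sigma^2 I) P_X' = sigma^2 P_X,
# using the symmetry and idempotence of P_X.
var_diff = (-P) @ (sigma**2 * np.eye(n)) @ (-P).T
assert np.allclose(var_diff, sigma**2 * P)
```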
e. Consider the multiple regression model y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I. Assume that ε ~ N(0, σ²I). When we test the hypothesis H₀: βᵢ = 0 against Hₐ: βᵢ ≠ 0 we use the t statistic with n − k − 1 degrees of freedom. When H₀ is not true, find the expected value and variance of the test statistic.
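When H₀ is false, the t statistic follows a noncentral t distribution with ν = n − k − 1 degrees of freedom and noncentrality δ = βᵢ / sd(β̂ᵢ). Its standard closed-form moments are E[T] = δ√(ν/2) Γ((ν−1)/2)/Γ(ν/2) for ν > 1 and Var[T] = ν(1+δ²)/(ν−2) − E[T]² for ν > 2. As a sketch (the ν and δ below are illustrative), these formulas can be cross-checked against SciPy's noncentral t:

```python
import math
from scipy.stats import nct

nu, delta = 10, 2.0   # illustrative degrees of freedom and noncentrality

# Closed-form moments of the noncentral t distribution (nu > 2):
mean = delta * math.sqrt(nu / 2) * math.gamma((nu - 1) / 2) / math.gamma(nu / 2)
var = nu * (1 + delta**2) / (nu - 2) - mean**2

assert math.isclose(mean, nct.mean(nu, delta), rel_tol=1e-6)
assert math.isclose(var, nct.var(nu, delta), rel_tol=1e-6)
```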
1. Given the multiple linear regression model Y = β₀ + β₁X₁ + β₂X₂ + β₃X₃ + ... + β_pX_p + ε, which in matrix notation is written as y = Xβ + ε, where ε has a N(0, σ²I) distribution. A. Show that the OLS estimator of the parameter vector β is given by β̂ = (XᵀX)⁻¹Xᵀy. B. Show that the OLS estimator in A above is an unbiased estimator of β. (Hint: E(β̂) = β.) C. Show that the variance of the estimator is var(β̂) = σ²(XᵀX)⁻¹. D. What is the distribution of the...
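Parts B and C both reduce to matrix algebra that can be verified numerically: β̂ = β + (XᵀX)⁻¹Xᵀε, so unbiasedness hinges on (XᵀX)⁻¹XᵀXβ = β, and the variance is the sandwich (XᵀX)⁻¹Xᵀ(σ²I)X(XᵀX)⁻¹. A minimal check, with arbitrary illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 15, 4
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p)
XtX_inv = np.linalg.inv(X.T @ X)

# Unbiasedness: beta_hat = (X'X)^{-1} X'y = beta + (X'X)^{-1} X' eps,
# so E(beta_hat) = beta because E(eps) = 0. The deterministic part:
assert np.allclose(XtX_inv @ X.T @ X @ beta, beta)

# Variance: var(beta_hat) = (X'X)^{-1} X' (sigma^2 I) X (X'X)^{-1}
#                         = sigma^2 (X'X)^{-1}.
sigma2 = 0.7
sandwich = XtX_inv @ X.T @ (sigma2 * np.eye(n)) @ X @ XtX_inv
assert np.allclose(sandwich, sigma2 * XtX_inv)
```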
Consider the multiple regression model y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I. Problem 1: Gauss-Markov theorem (revisited). We already know that E(β̂) = β and var(β̂) = σ²(XᵀX)⁻¹. Consider now another unbiased estimator of β, say b = Ay. Since we are assuming that b is unbiased we reach the conclusion that AX = I (why?). The Gauss-Markov theorem claims that var(b) − var(β̂) is positive semi-definite, which asks that we investigate qᵀ[var(b) − var(β̂)]q. Show...
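The claim can be illustrated numerically by building some other linear unbiased estimator b = Ay with AX = I and checking that var(b) − var(β̂) has no negative eigenvalues. One convenient family of such A (an illustrative choice, not the only one) is A = (XᵀWX)⁻¹XᵀW for a positive definite weight matrix W:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p, sigma2 = 20, 3, 1.0
X = rng.standard_normal((n, p))

# An alternative linear unbiased estimator b = A y with A X = I:
# A = (X'WX)^{-1} X'W for an arbitrary positive definite weight W.
M = rng.standard_normal((n, n))
W = M @ M.T + n * np.eye(n)            # symmetric positive definite
A = np.linalg.inv(X.T @ W @ X) @ X.T @ W
assert np.allclose(A @ X, np.eye(p))   # the unbiasedness condition AX = I

var_b = sigma2 * A @ A.T                        # var(Ay) with var(y) = sigma^2 I
var_ols = sigma2 * np.linalg.inv(X.T @ X)       # var(beta_hat)

# Gauss-Markov: var(b) - var(beta_hat) is positive semi-definite,
# i.e. all eigenvalues are >= 0 (up to floating-point noise).
eigs = np.linalg.eigvalsh(var_b - var_ols)
assert eigs.min() > -1e-10
```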
Exercise 2.6: Consider the models y = Xβ + ε and y* = X*β + ε*, where E(ε) = 0, cov(ε) = σ²I, y* = Γy, X* = ΓX, ε* = Γε, and Γ is a known n × n orthogonal matrix. Show that: 1. E(ε*) = 0, cov(ε*) = σ²I. 2. b* = b and s*² = s², where b and b* are the least squares estimates of β and s² and s*² are the estimates of σ² obtained from the two models.
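The invariance in part 2 follows from ΓᵀΓ = I: the normal equations and the residual sum of squares are unchanged by the rotation. A numerical illustration, using a random orthogonal Γ obtained from a QR decomposition (sizes and seed illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 12, 3
X = rng.standard_normal((n, p))
y = X @ np.array([1.0, 2.0, -1.0]) + rng.standard_normal(n)

Gamma, _ = np.linalg.qr(rng.standard_normal((n, n)))   # random orthogonal matrix
ys, Xs = Gamma @ y, Gamma @ X                          # transformed model

b = np.linalg.solve(X.T @ X, X.T @ y)
bs = np.linalg.solve(Xs.T @ Xs, Xs.T @ ys)
s2 = np.sum((y - X @ b) ** 2) / (n - p)
s2s = np.sum((ys - Xs @ bs) ** 2) / (n - p)

# Gamma'Gamma = I makes X*'X* = X'X and X*'y* = X'y, so b* = b; the
# residual vector is rotated but its squared length is preserved, so s*^2 = s^2.
assert np.allclose(b, bs)
assert np.isclose(s2, s2s)
```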
Consider a Gaussian linear model Y = aX + ε in a Bayesian view. Consider the prior π(a) = 1 for all a ∈ ℝ. Determine whether each of the following statements is true or false. (1) π(a) is a uniform prior. (a) True (b) False (2) π(a) is a Jeffreys prior when we consider the likelihood L(Y = y | A = a, X = x) (where we assume x is known). (a) True (b) False. Consider a linear regression model Y = Xβ + σε, where ε ∈ ℝⁿ is a random vector with E[ε] = 0, E[εεᵀ] = I. ...
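For question (2), the relevant computation is the Fisher information of a in Y = aX + ε with x and σ known: I(a) = x²/σ², which does not depend on a, so the Jeffreys prior √I(a) is constant, matching the flat prior π(a) = 1. A small sketch verifying that the curvature of the log-likelihood is the same at every a (the x, σ, and test points are illustrative):

```python
# Model: Y = a x + eps, eps ~ N(0, sigma^2), with x and sigma known.
# log L(a) = -(y - a x)^2 / (2 sigma^2) + const, so the observed
# information -d^2/da^2 log L = x^2 / sigma^2, independent of a and y.
# Hence the Jeffreys prior sqrt(I(a)) is constant, i.e. flat.
x, sigma = 3.0, 1.5

def neg_d2_loglik(a, y, h=1e-4):
    """Numerical second derivative of -log L at a (central difference)."""
    ll = lambda t: -(y - t * x) ** 2 / (2 * sigma ** 2)
    return -(ll(a + h) - 2 * ll(a) + ll(a - h)) / h ** 2

fisher = x ** 2 / sigma ** 2
for a in (-2.0, 0.0, 5.0):
    assert abs(neg_d2_loglik(a, y=1.0) - fisher) < 1e-3
```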
The random vector Y = (Y₁, ..., Yₙ)ᵀ is such that Y = Xβ + ε, where X is an n × p full-rank matrix of known constants, β is a p-length vector of unknown parameters, and ε is an n-length vector of random variables. A multiple linear regression model is fitted to the data. (a) Write down the multiple linear regression model assumptions in matrix format. (b) Derive the least squares estimator β̂ of β. (c) Using the data:...
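For part (b), minimizing ‖y − Xβ‖² leads to the normal equations XᵀXβ = Xᵀy, hence β̂ = (XᵀX)⁻¹Xᵀy when X has full column rank. The derivation can be cross-checked against NumPy's least squares solver on synthetic data (all values below are illustrative, not the data from part (c)):

```python
import numpy as np

rng = np.random.default_rng(6)
n, p = 9, 3
X = rng.standard_normal((n, p))   # full column rank with probability 1
y = rng.standard_normal(n)

# Normal equations X'X beta = X'y from minimizing ||y - X beta||^2.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against the library least squares solver.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(beta_hat, beta_lstsq)
```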