a. Consider the multiple regression model y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I, and a linear function c'β of β. Show that t...
Problem 1: Gauss-Markov theorem (revisited). Consider the multiple regression model y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I. We already know that E(β̂) = β and var(β̂) = σ²(X'X)⁻¹. Consider now another unbiased estimator of β, say b = AY. Since we are assuming that b is unbiased, we reach the conclusion that AX = I (why?). The Gauss-Markov theorem claims that var(b) − var(β̂) is positive semi-definite, which asks that we investigate q'[var(b) − var(β̂)]q. Show...
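The claim in this problem can be illustrated numerically. A minimal sketch, using simulated data (the design matrix, sample size, and σ² below are made-up choices, not from the problem): any other linear unbiased estimator can be written as b = AY with A = (X'X)⁻¹X' + D and DX = 0, and then var(b) − var(β̂) = σ²DD', which is positive semi-definite.

```python
import numpy as np

# Sketch of the Gauss-Markov claim: for any linear unbiased estimator
# b = AY with AX = I, var(b) - var(beta_hat) is positive semi-definite.
# All numbers below are illustrative assumptions.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
sigma2 = 1.0

# OLS: beta_hat = (X'X)^{-1} X'Y, so var(beta_hat) = sigma^2 (X'X)^{-1}
XtX_inv = np.linalg.inv(X.T @ X)
var_ols = sigma2 * XtX_inv

# Another linear unbiased estimator: A = (X'X)^{-1}X' + D with DX = 0.
# Construct D by post-multiplying a random matrix by the annihilator M.
M = np.eye(20) - X @ XtX_inv @ X.T      # M X = 0
D = rng.normal(size=(3, 20)) @ M        # hence D X = 0 and A X = I
A = XtX_inv @ X.T + D
assert np.allclose(A @ X, np.eye(3))    # unbiasedness condition AX = I

# var(b) = sigma^2 A A'; the difference equals sigma^2 D D', which is PSD.
var_b = sigma2 * (A @ A.T)
eigvals = np.linalg.eigvalsh(var_b - var_ols)
assert eigvals.min() > -1e-10           # numerically positive semi-definite
```

The key algebraic step the code exploits: with P = (X'X)⁻¹X', the cross term PD' vanishes because DX = 0, leaving AA' = PP' + DD'.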
Econometrics 13) Consider the classical linear regression model y = Xβ + ε, ε ~ N(0, σ²I). The data are collected in such a way that the X matrix is orthogonal, that is, X'X = I. We want to test the null hypothesis H₀: β₁ + β₂ + … + β_k = 0. For this particular hypothesis, the standard t-test for a single linear restriction r'β = q reduces to: a) t = …, b) t = (Σᵢ₌₁ᵏ bᵢ)/(s√k), c) t...
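The reduction can be checked numerically. A minimal sketch with simulated data (sample size, coefficients, and error scale are assumptions): with r = (1, …, 1)' and X'X = I, the general statistic t = r'b / (s·√(r'(X'X)⁻¹r)) collapses to (Σbᵢ)/(s√k).

```python
import numpy as np

# With X'X = I the t statistic for H0: b1 + ... + bk = 0 reduces to
# t = (sum of b_i) / (s * sqrt(k)). Data below are simulated assumptions.
rng = np.random.default_rng(1)
n, k = 30, 4
X, _ = np.linalg.qr(rng.normal(size=(n, k)))   # orthonormal columns: X'X = I
beta = np.array([0.5, -0.5, 0.3, -0.3])        # satisfies H0 (sums to 0)
y = X @ beta + rng.normal(scale=0.5, size=n)

b = X.T @ y                                    # OLS, since (X'X)^{-1} = I
resid = y - X @ b
s = np.sqrt(resid @ resid / (n - k))           # usual standard error s

r = np.ones(k)                                 # restriction r'beta = 0
t_general = (r @ b) / (s * np.sqrt(r @ np.linalg.inv(X.T @ X) @ r))
t_reduced = b.sum() / (s * np.sqrt(k))
assert np.isclose(t_general, t_reduced)
```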
Q4. [40 points] Consider the multiple linear regression model y = Xβ + ε, where y and ε are vectors of size 8 × 1, X is a matrix of size 8 × 3 and β is a vector of size 3 × 1. Also, the following information is available: e'e = 22, … and X'y = …
1. [10 points] Estimate the regression coefficients in the model given above.
2. [4 points] Estimate the variance of the error term...
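The recipe the problem calls for works entirely from summary matrices. A minimal sketch with hypothetical values (the problem's actual X'X and X'y entries are not legible in the source, so the matrices below are invented placeholders; only n = 8, k = 3, and e'e = 22 come from the problem): solve the normal equations for β̂, then divide the residual sum of squares by n − k.

```python
import numpy as np

# Hypothetical summary statistics standing in for the problem's values;
# only n = 8, k = 3 and e'e = 22 are taken from the problem statement.
n, k = 8, 3
XtX = np.array([[8., 2., 1.],
                [2., 6., 0.],
                [1., 0., 5.]])      # assumed X'X (placeholder)
Xty = np.array([10., 4., 3.])       # assumed X'y (placeholder)
ete = 22.0                          # residual sum of squares e'e

beta_hat = np.linalg.solve(XtX, Xty)   # normal equations X'X b = X'y
sigma2_hat = ete / (n - k)             # unbiased estimate of error variance
assert np.isclose(sigma2_hat, 22.0 / 5)
```

With n − k = 5 degrees of freedom, the variance estimate is 22/5 = 4.4 regardless of the placeholder matrices.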
4. Consider the linear model Y = Xβ + ε, where ε ~ MVN(0, σ²I). (1) Derive the formula for β̂, the least squares estimate of β, using matrix notation. (2) Show that β̂ is an unbiased estimate of β. (3) Derive the formula for var(β̂), using matrix notation.
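A compact sketch of the three derivations the problem asks for, in matrix notation:

```latex
% (1) Minimize the sum of squares:
\hat\beta = \arg\min_\beta\, (Y - X\beta)'(Y - X\beta)
\;\Rightarrow\; -2X'Y + 2X'X\hat\beta = 0
\;\Rightarrow\; \hat\beta = (X'X)^{-1}X'Y.

% (2) Unbiasedness, using E(Y) = X\beta:
E(\hat\beta) = (X'X)^{-1}X'E(Y) = (X'X)^{-1}X'X\beta = \beta.

% (3) Variance, using \mathrm{Var}(Y) = \sigma^2 I:
\mathrm{Var}(\hat\beta) = (X'X)^{-1}X'\,\mathrm{Var}(Y)\,X(X'X)^{-1}
= \sigma^2 (X'X)^{-1}.
```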
The linear regression model in matrix format is Y = Xβ + e, with the usual definitions. Let E(e|X) = 0 and Var(e|X) = Σ = diag(γ₁, γ₂, …, γ_N), a diagonal matrix with entries γ₁, …, γ_N. Notice that as a covariance matrix, Σ is symmetric and nonnegative definite. (i) Derive Var(β̂_OLS|X). (ii) Let β̃ = CY be any other linear unbiased estimator, where C is a K × N function of X. Prove Var(β̃|X) ≥ (X'Σ⁻¹X)⁻¹. The...
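Both parts can be illustrated numerically. A minimal sketch with simulated heteroskedastic variances (N, K, and the γᵢ below are assumptions): part (i) gives the sandwich form (X'X)⁻¹X'ΣX(X'X)⁻¹, and part (ii) says this dominates the bound (X'Σ⁻¹X)⁻¹ in the positive semi-definite sense.

```python
import numpy as np

# With Var(e|X) = Sigma = diag(g1, ..., gN), compare the OLS variance
# (X'X)^{-1} X' Sigma X (X'X)^{-1} against the bound (X' Sigma^{-1} X)^{-1}.
# Dimensions and variances below are illustrative assumptions.
rng = np.random.default_rng(2)
N, K = 25, 3
X = rng.normal(size=(N, K))
g = rng.uniform(0.5, 3.0, size=N)      # heteroskedastic error variances
Sigma = np.diag(g)

XtX_inv = np.linalg.inv(X.T @ X)
var_ols = XtX_inv @ X.T @ Sigma @ X @ XtX_inv     # part (i), sandwich form
var_bound = np.linalg.inv(X.T @ np.diag(1.0 / g) @ X)   # (X' Sigma^{-1} X)^{-1}

# Part (ii): var_ols - var_bound must be positive semi-definite.
eigvals = np.linalg.eigvalsh(var_ols - var_bound)
assert eigvals.min() > -1e-10
```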
Consider the following linear regression model: 1. For any X = x, let Y = xβ + U, where β ∈ Rᵏ. 2. X is exogenous. 3. The probability model is {f(u; θ): f is a distribution on R, E_f[U] = 0, Var_f[U] = σ², σ > 0}. 4. Sampling model: {Yᵢ}ᵢ₌₁ⁿ is an independent sample, sequentially generated using Yᵢ = xᵢβ + Uᵢ, where the Uᵢ are IID(0, σ²). (i) Let K > 0 be a given number. We wish to estimate β using least-squares...
5. Show that Var(Y) − Var(e) … in the simple linear regression model. (Yes, this should be that simple.) What did you assume?
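The statement is truncated in the source, but a natural identity in this area is the sample variance decomposition Var(y) = Var(ŷ) + Var(e), which holds when the regression includes an intercept (so the residuals sum to zero and are orthogonal to the fitted values). A minimal sketch with simulated data (all numbers are assumptions):

```python
import numpy as np

# Sample variance decomposition in simple linear regression with intercept:
# Var(y) = Var(yhat) + Var(e). Simulated data; coefficients are assumptions.
rng = np.random.default_rng(5)
x = rng.normal(size=40)
y = 1.0 + 2.0 * x + rng.normal(size=40)

X = np.column_stack([np.ones_like(x), x])
b = np.linalg.solve(X.T @ X, X.T @ y)   # OLS with intercept
yhat = X @ b
e = y - yhat                            # residuals: sum to 0, orthogonal to yhat

assert np.isclose(np.var(y), np.var(yhat) + np.var(e))
```

The assumption doing the work is the intercept: without it the residuals need not have mean zero and the cross term need not vanish.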
2.25 Consider the simple linear regression model y = β₀ + β₁x + ε, with E(ε) = 0, Var(ε) = σ², and the errors uncorrelated. a. Show that Cov(β̂₀, β̂₁) = −x̄σ²/S_xx. b. Show that Cov(ȳ, β̂₁) = 0. (In a very short, simple way.)
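Part (a) can be verified numerically from the matrix form, since Cov(β̂₀, β̂₁) is the off-diagonal entry of σ²(X'X)⁻¹. A minimal sketch with a made-up x sample and σ² (both are assumptions):

```python
import numpy as np

# Check Cov(b0, b1) = -xbar * sigma^2 / Sxx via sigma^2 (X'X)^{-1}.
# The x values and sigma^2 are illustrative assumptions.
x = np.array([1.0, 2.0, 4.0, 7.0, 11.0])
sigma2 = 2.0
n = len(x)
X = np.column_stack([np.ones(n), x])    # design matrix with intercept

cov = sigma2 * np.linalg.inv(X.T @ X)   # covariance matrix of (b0, b1)
Sxx = np.sum((x - x.mean()) ** 2)
assert np.isclose(cov[0, 1], -x.mean() * sigma2 / Sxx)
```

The short proof behind it: det(X'X) = n·S_xx, and the off-diagonal of (X'X)⁻¹ is −Σxᵢ/det, so σ²·(−Σxᵢ)/(n·S_xx) = −x̄σ²/S_xx.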
3. Consider the multiple linear regression model yᵢ = β₀ + β₁x₁,ᵢ + … + β_{p−1}x_{p−1},ᵢ + εᵢ, where x₁,ᵢ, …, x_{p−1},ᵢ are observed covariate values for observation i, and εᵢ ~ iid N(0, σ²). (a) What is the interpretation of β₁ in this model? (b) Write the matrix form of the model. Label the response vector, design matrix, coefficient vector, and error vector, and specify the dimensions and elements of each. (c) Write the likelihood and log-likelihood in matrix form. (d) Solve ∂ℓ/∂β = 0 for β̂, the MLE...
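For part (d), under normal errors the score equation ∂ℓ/∂β = X'(y − Xβ)/σ² = 0 reduces to the normal equations, so the MLE coincides with OLS. A minimal sketch with simulated data (dimensions and coefficients are assumptions):

```python
import numpy as np

# Under normal errors the MLE of beta solves X'X beta = X'y, i.e. OLS.
# Simulated data; check that the score vanishes at beta_hat.
rng = np.random.default_rng(4)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)   # MLE = OLS
score = X.T @ (y - X @ beta_hat)               # d(loglik)/d(beta), up to 1/sigma^2
assert np.allclose(score, 0)
```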
e. Consider the multiple regression model y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I. Assume that ε ~ N(0, σ²I). When we test the hypothesis H₀: βᵢ = 0 against Hₐ: βᵢ ≠ 0, we use the t statistic with n − k − 1 degrees of freedom. When H₀ is not true, find the expected value and variance of the test statistic.
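When H₀ is false, the statistic follows a noncentral t distribution with ν = n − k − 1 degrees of freedom and noncentrality δ, whose mean is δ·√(ν/2)·Γ((ν−1)/2)/Γ(ν/2) for ν > 1. A minimal Monte Carlo sketch of this mean formula, built from the representation t = (Z + δ)/√(V/ν) with Z standard normal and V chi-squared (ν and δ below are assumed values):

```python
import numpy as np
from math import gamma, sqrt

# When H0 is false, t = (Z + delta) / sqrt(V / nu) is noncentral t with
# nu degrees of freedom; its mean is delta*sqrt(nu/2)*Gamma((nu-1)/2)/Gamma(nu/2).
# nu and delta below are illustrative assumptions.
rng = np.random.default_rng(3)
nu, delta = 10, 2.0
Z = rng.normal(size=200_000)
V = rng.chisquare(nu, size=200_000)
t = (Z + delta) / np.sqrt(V / nu)

mean_exact = delta * sqrt(nu / 2) * gamma((nu - 1) / 2) / gamma(nu / 2)
assert abs(t.mean() - mean_exact) < 0.02   # Monte Carlo agreement
```

The variance can be checked the same way against ν(1 + δ²)/(ν − 2) − (mean)², valid for ν > 2.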