Suppose we have the full rank linear model y = Xβ + ε with n × p design matrix X, ...
Q5. Suppose Y ~ N_n(μ, σ²I) and X is an n × p matrix of constants with rank p (< n). a) Show that A = X(X'X)^{-1}X' and I − A are idempotent and find the rank of each. b) If μ is a linear combination of the columns of X, i.e. μ = Xb for some b, find E(Y'AY) and E(Y'(I − A)Y), where A is as in (a). c) Find the distributions of Y'AY/σ² and Y'(I − A)Y/σ². d) Show that Y'AY and Y'(I − A)Y...
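A quick numerical illustration (not part of the original exercise), assuming only an arbitrary full-column-rank X: idempotency can be checked directly, and for an idempotent matrix the rank equals the trace, giving rank(A) = p and rank(I − A) = n − p.

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 10, 3
    X = rng.normal(size=(n, p))            # any full-column-rank design will do
    A = X @ np.linalg.inv(X.T @ X) @ X.T   # A = X(X'X)^{-1}X'
    M = np.eye(n) - A                      # I - A

    print(np.allclose(A @ A, A), np.allclose(M @ M, M))   # idempotency: True True
    print(round(np.trace(A)), round(np.trace(M)))         # rank = trace: p and n - p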
Hello, please help me solve this problem and show all work, thank you. (Linear models) Suppose we have a vector of n observations Y (response), which has distribution N_n(Xβ, σ²I_n), where X is an n × p matrix of known values (independent variables) with full column rank p, and β is a p × 1 vector of unknown parameters. The least squares estimator of β is β̂. a. Determine the distribution of β̂. Determine the distribution of Ŷ = Xβ̂. b. Let Y...
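A sketch of the standard distributional argument, assuming β̂ = (X'X)^{-1}X'Y: a linear transform LY of Y ~ N_n(Xβ, σ²I_n) is normal with mean LXβ and covariance σ²LL', so

\hat\beta = (X'X)^{-1}X'Y \sim N_p\!\left(\beta,\ \sigma^2 (X'X)^{-1}\right),
\qquad
\hat Y = X\hat\beta = HY \sim N_n\!\left(X\beta,\ \sigma^2 H\right),
\quad H = X(X'X)^{-1}X'.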
II. Derivations (You must show all your work for full credit.) i. Given the model y = Xβ + ε, derive the least squares estimate of β. (10 points) ii. Show that β̂ = (X'X)^{-1}X'y is an unbiased estimate of β. (10 points) iii. Given V(β̂) = E[(β̂ − β)(β̂ − β)'], derive the variance-covariance matrix of the least squares estimator. (10 points) iv. Given the model y = Xβ + ε, the transformation matrix T, and T'T = Σ^{-1}, derive the GLS estimator. (10 points)
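A minimal numpy sketch of parts i, ii and iv with made-up data (the design, true β and Σ below are illustrative assumptions): it computes the closed-form OLS estimate β̂ = (X'X)^{-1}X'y and obtains the GLS estimate both directly and through a transformation T with T'T = Σ^{-1}.

    import numpy as np

    rng = np.random.default_rng(1)
    n, p = 50, 3
    X = rng.normal(size=(n, p))                      # hypothetical design
    beta = np.array([1.0, -2.0, 0.5])                # hypothetical true coefficients
    Sigma = np.diag(rng.uniform(0.5, 2.0, size=n))   # assumed error covariance
    y = X @ beta + rng.multivariate_normal(np.zeros(n), Sigma)

    # i-ii: OLS, beta_hat = (X'X)^{-1} X'y
    beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

    # iv: GLS directly, (X' Sigma^{-1} X)^{-1} X' Sigma^{-1} y
    Si = np.linalg.inv(Sigma)
    beta_gls = np.linalg.solve(X.T @ Si @ X, X.T @ Si @ y)

    # iv: the same estimate via OLS on the transformed model Ty = TXb + Te,
    # with T = Sigma^{-1/2} so that T'T = Sigma^{-1}
    T = np.diag(1.0 / np.sqrt(np.diag(Sigma)))
    beta_gls2 = np.linalg.lstsq(T @ X, T @ y, rcond=None)[0]

    print(np.allclose(beta_gls, beta_gls2))          # True: transformed OLS equals GLS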
Exercise 5 Consider a linear model with n = 2m in which Y_i = β_0 + β_1 x_i + ε_i, i = 1, ..., m, and Y_i = β_0 + β_2 x_i + ε_i, i = m + 1, ..., n. Here ε_1, ..., ε_n are i.i.d. from N(0, σ²), β = (β_0, β_1, β_2)' and σ² are unknown parameters, and x_1, ..., x_n are known constants with x_1 + ... + x_m = x_{m+1} + ... + x_n = 0. 1. Write the model in vector form...
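A small sketch of the stacked design matrix for part 1, with hypothetical x values satisfying the zero-sum constraints: the intercept column is all ones, the β_1 column carries x_i only for the first m rows, and the β_2 column carries x_i only for the last m rows.

    import numpy as np

    m = 3
    x = np.array([-1.0, 0.0, 1.0, -2.0, 0.0, 2.0])  # each half sums to zero
    n = 2 * m

    X = np.zeros((n, 3))
    X[:, 0] = 1.0        # intercept column (beta_0)
    X[:m, 1] = x[:m]     # beta_1 enters only the first m observations
    X[m:, 2] = x[m:]     # beta_2 enters only the last m observations
    print(X)             # y = X @ (beta_0, beta_1, beta_2)' + epsilon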
Consider a Gaussian linear model Y = aX + ε in a Bayesian view. Consider the prior π(a) = 1 for all a ∈ R. Determine whether each of the following statements is true or false. (1) π(a) is a uniform prior. (a) True (b) False (2) π(a) is a Jeffreys prior when we consider the likelihood L(Y = y | A = a, X = x) (where we assume x is known). (a) True (b) False. Consider a linear regression model Y = Xβ + σε, where ε ∈ R^n is a random vector with E[ε] = 0, E[εε'] = I. ...
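For intuition on statement (2), a sketch assuming the error variance σ² is known along with x: the Fisher information for a does not depend on a, so the Jeffreys prior is flat and coincides with π(a) = 1 up to a constant,

I(a) = -\,\mathrm{E}\!\left[\frac{\partial^{2}}{\partial a^{2}}\log L(y \mid a, x)\right] = \frac{x'x}{\sigma^{2}},
\qquad
\pi_J(a) \propto \sqrt{I(a)} \propto 1 .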
Please help. Question 2. (2.5 points) You are considering the model Y = X_1 β_1 + X_2 β_2 + ε, (*) where E(ε) = 0 and E(εε') = σ²I_n. Here, X_1 is n × p and X_2 is n × q, where p > 1 and q > 1. Suppose that, in fact, unknown to you, β_2 = 0. In other words, (*) is an over-parameterized model. Let e be the vector of residuals corresponding to the fitted version of (*) based on the least squares method. Does...
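A numerical illustration of the setup (made-up data, not the answer to the question): the residual vector of the over-parameterized fit is e = (I − P)Y, where P projects onto the column space of the combined design [X_1 X_2], so e is orthogonal to both blocks.

    import numpy as np

    rng = np.random.default_rng(2)
    n, p, q = 30, 2, 3
    X1 = rng.normal(size=(n, p))
    X2 = rng.normal(size=(n, q))
    beta1 = np.array([1.0, -1.0])
    y = X1 @ beta1 + rng.normal(size=n)   # data generated with beta_2 = 0

    Xfull = np.hstack([X1, X2])
    P = Xfull @ np.linalg.pinv(Xfull)     # projection onto col([X1 X2])
    e = y - P @ y                         # residuals of the over-parameterized fit
    print(np.round(e @ Xfull, 10))        # ~0: residuals orthogonal to X1 and X2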
2. The linear regression model in matrix format is Y = Xβ + e, with the usual definitions. Let E(e|X) = 0 and Var(e|X) = Σ = diag(γ_1, γ_2, ..., γ_N). Notice that, as a covariance matrix, Σ is symmetric and nonnegative definite. (i) Derive Var(β̂_OLS | X). (ii) Let β̃ = C'Y be any other linear unbiased estimator, where C is an N × K function of X. Prove Var(β̃|X) ≥ (X'Σ^{-1}X)^{-1}. 3. An oracle...
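A quick numeric check of the two parts (illustrative γ's, and assuming the bound in (ii) is the GLS variance (X'Σ^{-1}X)^{-1}): the OLS variance is the sandwich (X'X)^{-1}X'ΣX(X'X)^{-1}, and its difference from (X'Σ^{-1}X)^{-1} is nonnegative definite.

    import numpy as np

    rng = np.random.default_rng(3)
    N, K = 40, 3
    X = rng.normal(size=(N, K))
    gamma = rng.uniform(0.5, 3.0, size=N)       # hypothetical diagonal of Sigma
    Sigma = np.diag(gamma)

    XtX_inv = np.linalg.inv(X.T @ X)
    var_ols = XtX_inv @ X.T @ Sigma @ X @ XtX_inv            # (i) sandwich form
    var_gls = np.linalg.inv(X.T @ np.linalg.inv(Sigma) @ X)  # efficiency bound

    # (ii) var_ols - var_gls should be nonnegative definite
    print(np.linalg.eigvalsh(var_ols - var_gls).min() >= -1e-10)   # True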
1. Consider the following linear model y = Xβ + ε, where Cov(ε) = σ²I with σ ∈ R⁺ being unknown. Let C'β be an estimable function, where C is a full column rank matrix of rank s, and let T'y be the BLUE for C'β. a. Write down an explicit expression for T; it should be only in terms of C, y and X. What basic result do you use to justify your answer? b. Find V = Cov(T'y). c. The hypothesis is H: C'β = d. ... (T'y − d), where b....
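A sketch of the usual answer for part a, under the extra assumption that X has full column rank (the exercise itself only assumes C'β is estimable): by the Gauss–Markov theorem the BLUE of C'β is C'β̂ with β̂ the least squares estimator, so

T'y = C'\hat\beta = C'(X'X)^{-1}X'y
\;\Longrightarrow\;
T = X(X'X)^{-1}C,
\qquad
\mathrm{Cov}(T'y) = \sigma^{2}\, C'(X'X)^{-1}C .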
Exercise 4.11 Consider the regression model Y_i = β_0 + β_1 X_i + u_i. Suppose that you know β_0. Derive the formula for the least squares estimator of β_1. The least squares objective function is ...
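A sketch of the first-order condition when β_0 is known, which is the standard route to the estimator asked for here:

\frac{\partial}{\partial b_1}\sum_{i=1}^{n}\left(Y_i-\beta_0-b_1X_i\right)^{2}
= -2\sum_{i=1}^{n}X_i\left(Y_i-\beta_0-b_1X_i\right) = 0
\;\Longrightarrow\;
\hat\beta_1=\frac{\sum_{i=1}^{n}X_i\,(Y_i-\beta_0)}{\sum_{i=1}^{n}X_i^{2}} .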
The standard linear regression model is y = Xw + e, where X is an n × d matrix of predictor variables, y is an n-dimensional vector of response variables, and e ~ N(0, σ²I) is an n-dimensional vector of model errors. (a) What is the PDF of y in terms of X, w, σ²? (b) Let the PDF from part (a) be denoted f(y|w). Suppose also in this case that w ~ N(0, ρ²I). Write an expression for the joint PDF of w and y...
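A sketch of the two densities, assuming (as the fragment suggests) that part (b) puts the prior w ~ N(0, ρ²I) on the coefficients:

f(y \mid w) = (2\pi\sigma^{2})^{-n/2}\exp\!\left(-\frac{1}{2\sigma^{2}}\lVert y - Xw\rVert^{2}\right),
\qquad
f(w, y) = f(y \mid w)\,(2\pi\rho^{2})^{-d/2}\exp\!\left(-\frac{1}{2\rho^{2}}\lVert w\rVert^{2}\right).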