3. Consider the linear model Y_i = α + βx_i + ε_i for i = 1, ..., n, where E(ε_i) = 0. Further assume that Σ_{i=1}^n x_i = 0 and Σ_{i=1}^n x_i² = n. (a) Show that the least squares estimates (LSEs) of α and β are given by α̂ = Ȳ and β̂ = (1/n) Σ_{i=1}^n x_i Y_i. (b) Show that the LSEs in (a) are unbiased. (c) Assume that E(ε_i²) = σ² and E(ε_i ε_j) = 0 for all i ≠ j, where σ² > 0. Show that V(β̂) = σ²/n and V(α̂) = σ²/n. (d) Use (b) and (c) above to show that the LSEs are consistent...
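A quick numerical sanity check of part (a) (not a substitute for the algebraic proof): on made-up data rescaled so that Σx_i = 0 and Σx_i² = n, the least-squares fit should reproduce the closed forms α̂ = Ȳ and β̂ = (1/n)Σx_iY_i. All data below are invented for illustration.

```python
import numpy as np

# Made-up data satisfying the normalization sum(x) = 0, sum(x^2) = n
rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
x = x - x.mean()                    # enforce sum(x) = 0
x = x * np.sqrt(n / np.sum(x**2))   # enforce sum(x^2) = n
y = 2.0 + 3.0 * x + rng.normal(size=n)

# Full least-squares fit via the design matrix [1, x]
X = np.column_stack([np.ones(n), x])
alpha_hat, beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]

# The closed forms claimed in part (a)
assert np.isclose(alpha_hat, y.mean())
assert np.isclose(beta_hat, np.sum(x * y) / n)
```

The check works because with Σx_i = 0 the normal equations decouple: the intercept equation gives α̂ = Ȳ directly, and the slope equation gives β̂ = Σx_iY_i / Σx_i² = (1/n)Σx_iY_i.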
Consider the least-squares residuals e_i = y_i − ŷ_i, i = 1, 2, ..., n, from the simple linear regression model. Find the variance of the residuals, Var(e_i). Is the variance of the residuals a constant? Discuss.
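A small numerical illustration of the point the exercise is driving at (a sketch on invented data, not the requested derivation): Var(e_i) = σ²(1 − h_ii), where h_ii is the i-th diagonal entry of the hat matrix H = X(X′X)⁻¹X′, and the h_ii generally differ across observations, so the residual variance is not constant.

```python
import numpy as np

# Invented design with one high-leverage point at x = 10
x = np.array([1.0, 2.0, 3.0, 4.0, 10.0])
X = np.column_stack([np.ones_like(x), x])

# Hat matrix H = X (X'X)^{-1} X'; its diagonal gives the leverages h_ii
H = X @ np.linalg.inv(X.T @ X) @ X.T
h = np.diag(H)

print(1 - h)  # Var(e_i)/sigma^2 for each i; smallest at the leverage point
```

The high-leverage observation has the largest h_ii, hence the smallest residual variance, which is exactly why the residual variances cannot all be equal.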
5.26 Suppose that y is N_n(μ, Σ), where μ = μj and cov(y_i, y_j) = σ²ρ for all i ≠ j. Thus E(y_i) = μ for all i, var(y_i) = σ² for all i, and cov(y_i, y_j) = σ²ρ for i ≠ j; that is, the y's are equicorrelated. (a) Show that Σ can be written in the form Σ = σ²[(1 − ρ)I + ρJ]. (b) Show that Σ_{i=1}^n (y_i − ȳ)² / [σ²(1 − ρ)] is χ²(n − 1).
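A hedged numerical check of part (a): constructing σ²[(1 − ρ)I + ρJ] for arbitrary made-up values of n, σ², and ρ should reproduce exactly the equicorrelated structure stated in the problem (diagonal σ², off-diagonal σ²ρ).

```python
import numpy as np

# Made-up parameter values for illustration
n, sigma2, rho = 5, 2.0, 0.3

# The claimed decomposition: Sigma = sigma^2 * ((1 - rho) I + rho J)
I, J = np.eye(n), np.ones((n, n))
Sigma = sigma2 * ((1 - rho) * I + rho * J)

# Diagonal entries are var(y_i) = sigma^2
assert np.allclose(np.diag(Sigma), sigma2)

# Off-diagonal entries are cov(y_i, y_j) = sigma^2 * rho
off_diag = Sigma[~np.eye(n, dtype=bool)]
assert np.allclose(off_diag, sigma2 * rho)
```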
In the simple linear regression with zero constant term, for (x_i, y_i) where i = 1, 2, ..., n, Y_i = βx_i + ε_i where {ε_i}_{i=1}^n are i.i.d. N(0, σ²). (a) Derive the normal equation that the LS estimator, β̂, satisfies. (b) Show that the LS estimator of β is given by β̂ = Σ_{i=1}^n x_i Y_i / Σ_{i=1}^n x_i². (c) Show that E(β̂) = β, Var(β̂) = σ...
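A numerical sanity check of part (b) on invented data: the single normal equation Σx_i(Y_i − βx_i) = 0 gives β̂ = Σx_iY_i / Σx_i², which should match a direct no-intercept least-squares fit.

```python
import numpy as np

# Made-up data from a no-intercept model
rng = np.random.default_rng(1)
x = rng.uniform(1, 5, size=30)
y = 2.5 * x + rng.normal(size=30)

# Closed form from the normal equation sum(x_i (Y_i - beta x_i)) = 0
beta_hat = np.sum(x * y) / np.sum(x**2)

# Direct least-squares fit with a single column (no intercept)
beta_lstsq = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]
assert np.isclose(beta_hat, beta_lstsq)
```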
(5 points) Suppose that Y_i = β_0 + Σ_j x_ij β_j + e_i, where e_1, ..., e_n are i.i.d. N(0, σ²). Write out the likelihood for the data and show that maximizing it is equivalent to using ordinary least squares.
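A numerical sketch of the equivalence being asked for (on invented data, not the requested likelihood derivation): with Gaussian errors, −2 log L(β, σ²) = n log(2πσ²) + SSE(β)/σ², so for any fixed σ² the β maximizing the likelihood is the β minimizing SSE, i.e. the OLS solution. Perturbing the OLS coefficients can therefore only increase −2 log L.

```python
import numpy as np

# Made-up data from a two-coefficient linear model
rng = np.random.default_rng(2)
n = 40
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, -2.0]) + rng.normal(size=n)

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

def neg2loglik(beta, sigma2=1.0):
    """-2 log-likelihood under i.i.d. N(0, sigma2) errors."""
    sse = np.sum((y - X @ beta) ** 2)
    return n * np.log(2 * np.pi * sigma2) + sse / sigma2

# Any perturbation of the OLS solution increases -2 log L
for _ in range(100):
    b = beta_ols + rng.normal(scale=0.1, size=2)
    assert neg2loglik(b) >= neg2loglik(beta_ols)
```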
2. Let Y_i = β_0 + β_1 x_i + ε_i (i = 1, 2, ..., n), where the ε_i are independent N(0, σ²). (a) Show that the correlation coefficient of β̂_0 and β̂_1 is −n x̄ / (n Σ_{i=1}^n x_i²)^{1/2}. (b) Derive an F-statistic for testing H: β_1 = 0.
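A hedged numerical check of part (a) on an invented design: since Cov(β̂) = σ²(X′X)⁻¹, the correlation of β̂_0 and β̂_1 computed from (X′X)⁻¹ should equal −n x̄ / √(n Σx_i²), equivalently −x̄ / √(Σx_i²/n).

```python
import numpy as np

# Made-up x values for illustration
x = np.array([0.5, 1.0, 2.0, 3.5, 4.0, 6.0])
n = len(x)
X = np.column_stack([np.ones(n), x])

# Cov(beta_hat) / sigma^2 = (X'X)^{-1}; sigma^2 cancels in the correlation
C = np.linalg.inv(X.T @ X)
corr = C[0, 1] / np.sqrt(C[0, 0] * C[1, 1])

assert np.isclose(corr, -n * x.mean() / np.sqrt(n * np.sum(x**2)))
```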
5) Consider the simple linear regression model y_i = α + βx_i + ε_i, ε_i ~ N(0, σ²), i = 1, ..., n. Let ȳ be the mean of the y_i, and let α̂ and β̂ be the MLEs of α and β, respectively. Let ŷ_i = α̂ + β̂x_i be the fitted values, and let e_i = y_i − ŷ_i be the residuals. (a) What is Cov(ȳ, β̂)? (b) What is Cov(α̂, β̂)? (c) Show that Σ_{i=1}^n e_i = 0. (d) Show that Σ_{i=1}^n x_i e_i = 0. (e) Show that Σ_{i=1}^n ŷ_i e_i = ...
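A quick numerical check of the residual identities in parts (c)–(e) on invented data: OLS residuals are orthogonal to every column of the design, hence to the constant column, to x, and to the fitted values (which are a linear combination of both).

```python
import numpy as np

# Made-up data from a simple linear model
rng = np.random.default_rng(3)
n = 25
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
yhat = X @ b
e = y - yhat

assert np.isclose(np.sum(e), 0)          # (c): residuals sum to zero
assert np.isclose(np.sum(x * e), 0)      # (d): residuals orthogonal to x
assert np.isclose(np.sum(yhat * e), 0)   # (e): residuals orthogonal to fits
```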
Consider the zero-intercept model given by Y_i = β_1 X_i + ε_i (i = 1, ..., n), with the ε_i normal, independent, with variance σ². For this model: (i) find Σ(Y_i − Ŷ_i); (ii) find Σ(Y_i − Ŷ_i)X_i; (iii) find the estimator of the error variance σ²; (iv) is the estimator of the error variance biased?
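A numerical sketch of the contrast between parts (i) and (ii), on invented data: the single normal equation forces Σ(Y_i − Ŷ_i)X_i = 0 exactly, but Σ(Y_i − Ŷ_i) is generally nonzero because there is no intercept column to absorb the mean.

```python
import numpy as np

# Made-up data from the zero-intercept model
rng = np.random.default_rng(4)
x = rng.uniform(1, 3, size=20)
y = 1.5 * x + rng.normal(size=20)

b = np.sum(x * y) / np.sum(x**2)   # zero-intercept LS estimator
e = y - b * x                      # residuals

assert np.isclose(np.sum(x * e), 0)   # (ii) is exactly zero
print(np.sum(e))                      # (i): generally nonzero here

# (iii): with one fitted parameter, SSE / (n - 1) is the usual estimator
s2 = np.sum(e**2) / (len(x) - 1)
```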
Exercise 5. Consider a linear model with n = 2m in which y_i = β_0 + β_1 x_i + e_i for i = 1, ..., m, and y_i = β_0 + β_2 x_i + e_i for i = m + 1, ..., n. Here e_1, ..., e_n are i.i.d. from N(0, σ²); β = (β_0, β_1, β_2)′ and σ² are unknown parameters; x_1, ..., x_n are known constants with x_1 + ... + x_m = x_{m+1} + ... + x_n = 0. 1. Write the model in vector form as Y = Xβ + ε, describing the entries in the matrix X. 2. Determine the least squares estimator β̂ of β.
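A numerical sketch of part 1, under the reading that the second half of the data has slope β_2 (an assumption inferred from β = (β_0, β_1, β_2)′): X then has an intercept column plus two "half" columns of x, and the zero-sum constraints on each half make X′X diagonal, so β̂_0 = ȳ falls out immediately. All data below are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
m = 10
n = 2 * m

# Made-up x values, centered within each half so each half sums to zero
x = rng.normal(size=n)
x[:m] -= x[:m].mean()   # x_1 + ... + x_m = 0
x[m:] -= x[m:].mean()   # x_{m+1} + ... + x_n = 0
y = 1.0 + np.concatenate([2.0 * x[:m], -1.0 * x[m:]]) + rng.normal(size=n)

# Design matrix: intercept column, then x on the first half, x on the second
X = np.zeros((n, 3))
X[:, 0] = 1.0
X[:m, 1] = x[:m]
X[m:, 2] = x[m:]

XtX = X.T @ X
assert np.allclose(XtX, np.diag(np.diag(XtX)))   # X'X is diagonal
beta_hat = np.linalg.solve(XtX, X.T @ y)
assert np.isclose(beta_hat[0], y.mean())         # intercept estimate is ybar
```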
1. Consider the simple linear regression model Y_i = β_0 + β_1 x_i + ε_i, where ε_1, ..., ε_n are i.i.d. N(0, σ²), for i = 1, 2, ..., n. Let b_1 = S_xy/S_xx and b_0 = Ȳ − b_1 x̄ be the least squares estimators of β_1 and β_0, respectively. We showed in class that b_1 ~ N(β_1, σ²/S_xx) and Ȳ ~ N(β_0 + β_1 x̄, σ²/n). We also showed in class that b_1 and Ȳ are uncorrelated, i.e. σ{Ȳ, b_1} = 0. (a) Show that b_0 is...
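A quick numerical check, on invented data, that the closed forms b_1 = S_xy/S_xx and b_0 = Ȳ − b_1 x̄ agree with a direct least-squares fit:

```python
import numpy as np

# Made-up data from a simple linear model
rng = np.random.default_rng(6)
x = rng.normal(size=30)
y = 0.5 + 1.5 * x + rng.normal(size=30)

# Closed forms via the centered sums of squares / cross-products
Sxx = np.sum((x - x.mean()) ** 2)
Sxy = np.sum((x - x.mean()) * (y - y.mean()))
b1 = Sxy / Sxx
b0 = y.mean() - b1 * x.mean()

# Direct least-squares fit on the design [1, x]
X = np.column_stack([np.ones_like(x), x])
assert np.allclose([b0, b1], np.linalg.lstsq(X, y, rcond=None)[0])
```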