2. Suppose we are given data on n observations \((x_i, Y_i)\), \(i = 1, \ldots, n\), and we have a linear model, so that \(E(Y_i) = \beta_0 + \beta_1 x_i\). Let \(\hat\beta_1 = S_{XY}/S_{XX}\) and \(\hat\beta_0 = \bar{Y} - \hat\beta_1 \bar{x}\) be the least-squares estimates given in lecture. (a) Show that \(E(S_{XY}) = \beta_1 S_{XX}\) and \(E(\bar{Y}) = \beta_0 + \beta_1 \bar{x}\). (b) Use (a) to show that \(E(\hat\beta_1) = \beta_1\) and \(E(\hat\beta_0) = \beta_0\). In other words, these are unbiased estimators. (c) The fitted values \(\hat{Y}_i = \hat\beta_0 + \hat\beta_1 x_i\) are used as estimates of \(E(Y_i)\), and the residuals \(e_i = Y_i - \hat{Y}_i\) ...
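The unbiasedness claimed in part (b) can be checked numerically by simulating many data sets from the model and averaging the estimates. A minimal sketch, with illustrative (assumed) values for \(\beta_0\), \(\beta_1\), \(\sigma\), and the design points:

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1, sigma = 2.0, 0.5, 1.0   # assumed true parameters (for illustration)
x = np.linspace(0, 10, 25)            # fixed design points

b0_hats, b1_hats = [], []
for _ in range(5000):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
    Sxx = np.sum((x - x.mean()) ** 2)
    Sxy = np.sum((x - x.mean()) * (y - y.mean()))
    b1 = Sxy / Sxx                    # beta1-hat = S_XY / S_XX
    b0 = y.mean() - b1 * x.mean()     # beta0-hat = Ybar - beta1-hat * xbar
    b0_hats.append(b0)
    b1_hats.append(b1)

# Averages of the estimates should be close to the assumed beta0 and beta1.
print(np.mean(b0_hats), np.mean(b1_hats))
```

The Monte Carlo averages land near the assumed parameter values, consistent with \(E(\hat\beta_0) = \beta_0\) and \(E(\hat\beta_1) = \beta_1\).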
4. (24 marks) Suppose that the random variables \(Y_1, \ldots, Y_n\) satisfy \(Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i\), \(i = 1, \ldots, n\), where \(\beta_0\) and \(\beta_1\) are parameters, \(X_1, \ldots, X_n\) are constants, and \(\varepsilon_1, \ldots, \varepsilon_n\) are independent and identically distributed random variables with \(\varepsilon_i \sim N(0, \sigma^2)\), where \(\sigma^2\) is a third unknown parameter. This is the familiar form for a simple linear regression model, where the parameters \(\beta_0\), \(\beta_1\), and \(\sigma^2\) explain the relationship between a dependent (or response) variable \(Y\) ...
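The model set up above can be simulated directly, which is often useful for checking derivations about its estimators. A sketch, with all numeric values chosen for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
beta0, beta1, sigma2 = 1.0, 2.0, 0.25      # assumed parameter values
X = rng.uniform(0, 5, size=n)              # treated as fixed constants X_1,...,X_n
eps = rng.normal(0.0, np.sqrt(sigma2), n)  # iid N(0, sigma^2) errors
Y = beta0 + beta1 * X + eps                # Y_i = beta0 + beta1 * X_i + eps_i

# Least-squares fit recovers estimates of beta1 and beta0.
b1_hat, b0_hat = np.polyfit(X, Y, 1)
print(b0_hat, b1_hat)
```

With \(n = 50\) observations the fitted slope and intercept should land near the assumed values, though any single draw will show sampling variability of order \(\sigma/\sqrt{S_{XX}}\).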
4. We have n statistical units. For unit i, we have \((x_i, y_i)\), for \(i = 1, 2, \ldots, n\). We used the least squares line to obtain the estimated regression line \(\hat{y} = b_0 + b_1 x\). (a) Show that the centroid \((\bar{x}, \bar{y})\) is a point on the least squares line, where \(\bar{x} = (1/n)\sum_{i=1}^n x_i\) and \(\bar{y} = (1/n)\sum_{i=1}^n y_i\). (Hint: Evaluate the line at \(x = \bar{x}\).) (b) In the suggested exercises, we showed that \(\sum_{i=1}^n e_i = 0\) and \(\sum_{i=1}^n x_i e_i = 0\), where \(e_i\) is the ith residual, that is \(e_i = y_i - \hat{y}_i\) ...
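Both facts — the centroid lying on the fitted line, and the residual identities — are easy to confirm numerically on simulated data. A minimal sketch (the data-generating values are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 30)
y = 3.0 + 1.5 * x + rng.normal(0, 1, 30)   # hypothetical data

b1, b0 = np.polyfit(x, y, 1)               # least squares slope b1, intercept b0
xbar, ybar = x.mean(), y.mean()

# (a) Evaluating the fitted line at xbar returns ybar: the centroid is on the line.
print(b0 + b1 * xbar, ybar)

# (b) The residuals sum to zero, as does their weighted sum with the x_i.
e = y - (b0 + b1 * x)
print(e.sum(), (x * e).sum())
```

Both printed pairs agree to floating-point precision, matching parts (a) and (b).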
5) Consider the simple linear regression model \(y_i = \alpha + \beta x_i + \varepsilon_i\), \(\varepsilon_i \sim N(0, \sigma^2)\), \(i = 1, \ldots, n\). Let \(\bar{y}\) be the mean of the \(y_i\), and let \(\hat\alpha\) and \(\hat\beta\) be the MLEs of \(\alpha\) and \(\beta\), respectively. Let \(\hat{y}_i = \hat\alpha + \hat\beta x_i\) be the fitted values, and let \(e_i = y_i - \hat{y}_i\) be the residuals. a) What is \(\mathrm{Cov}(\bar{y}, \hat\beta)\)? b) What is \(\mathrm{Cov}(\hat\alpha, \hat\beta)\)? c) Show that \(\sum_{i=1}^n e_i = 0\). d) Show that \(\sum_{i=1}^n x_i e_i = 0\). e) Show that \(\sum_{i=1}^n \hat{y}_i e_i = \) ...
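Parts (c)–(e) can be spot-checked numerically; under normal errors the MLEs of \(\alpha\) and \(\beta\) coincide with the least squares estimates, so an ordinary least squares fit suffices. A sketch on hypothetical data:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 40
x = rng.normal(0, 2, n)
y = 1.0 + 0.7 * x + rng.normal(0, 0.5, n)   # assumed data-generating values

beta_hat, alpha_hat = np.polyfit(x, y, 1)   # MLE = least squares under normality
y_fit = alpha_hat + beta_hat * x
e = y - y_fit

print(e.sum())            # part c: sum of residuals is 0
print((x * e).sum())      # part d: residuals are orthogonal to x
print((y_fit * e).sum())  # part e: follows from (c) and (d), since
                          # y_fit is a linear combination of 1 and x
```

All three sums vanish up to floating-point error, and the comment on part (e) is essentially the proof: \(\sum \hat{y}_i e_i = \hat\alpha \sum e_i + \hat\beta \sum x_i e_i = 0\).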
Consider a multiple linear regression model: we compute the regression of \(Y\), an \(n \times 1\) vector, on an \(n \times (p+1)\) full-rank matrix \(X\). As usual, \(H = X(X^T X)^{-1} X^T\) is the hat matrix, with element \(h_{ij}\) in the ith row and jth column. The residual is \(e_i = y_i - \hat{y}_i\). (a) (7 points) Let \(Y\) be an \(n \times 1\) vector with 1 as its first element and 0s elsewhere. Show that the elements of the ...
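The key computational facts about \(H\) used in problems like (a) — that applying \(H\) to a unit vector extracts a column of \(H\), and that \(H\) is symmetric and idempotent — can be verified directly. A small sketch with an arbitrary full-rank design matrix (dimensions chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 8, 2
# n x (p+1) design matrix: intercept column plus p random predictors.
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
H = X @ np.linalg.inv(X.T @ X) @ X.T     # hat matrix

# With Y = (1, 0, ..., 0)^T, the product HY is exactly the first column of H,
# i.e. its elements are h_11, h_21, ..., h_n1.
Y = np.zeros(n)
Y[0] = 1.0
print(np.allclose(H @ Y, H[:, 0]))

# H is symmetric and idempotent (H = H^T and H H = H).
print(np.allclose(H, H.T), np.allclose(H @ H, H))
```

Symmetry and idempotence together give the frequently used identity \(h_{ii} = \sum_j h_{ij}^2\), which bounds each leverage \(h_{ii}\) between 0 and 1.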