2. Suppose we are given data on n observations (xi, Yi), i = 1, ..., n, and we have a linear model, so that E(Yi) = β0 + β1 xi. Let β̂1 = SXY/SXX and β̂0 = Ȳ − β̂1 x̄ be the least-squares estimates given in lecture.
(a) Show that E(SXY) = β1 SXX and E(Ȳ) = β0 + β1 x̄.
(b) Use (a) to show that E(β̂1) = β1 and E(β̂0) = β0. In other words, these are unbiased estimators.
(c) The fitted values Ŷi = β̂0 + β̂1 xi are used as estimates of E(Yi), and the residuals êi = Yi − Ŷi are used as surrogates for the unobservable ...
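A quick Monte Carlo sanity check of (a) and (b): simulate many samples from an assumed set of illustrative values (β0 = 2, β1 = 0.5, a fixed xi grid, normal errors; none of these numbers come from the problem) and compare the averages of SXY, β̂1, and β̂0 with β1·SXX, β1, and β0.

    import numpy as np

    rng = np.random.default_rng(0)
    beta0, beta1, sigma = 2.0, 0.5, 1.0        # illustrative true values, not from the problem
    x = np.linspace(0.0, 10.0, 25)             # fixed design points
    Sxx = np.sum((x - x.mean()) ** 2)

    sxy, b1, b0 = [], [], []
    for _ in range(20000):
        y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
        s = np.sum((x - x.mean()) * (y - y.mean()))    # SXY for this sample
        sxy.append(s)
        b1.append(s / Sxx)                             # slope estimate SXY/SXX
        b0.append(y.mean() - (s / Sxx) * x.mean())     # intercept estimate Ȳ − β̂1 x̄
    print(np.mean(sxy), beta1 * Sxx)   # E(SXY) ≈ β1·SXX
    print(np.mean(b1), beta1)          # E(β̂1) ≈ β1
    print(np.mean(b0), beta0)          # E(β̂0) ≈ β0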
4. (24 marks) Suppose that the random variables Y1, ..., Yn satisfy Yi = β0 + β1 Xi + εi, i = 1, ..., n, where β0 and β1 are parameters, X1, ..., Xn are constants, and ε1, ..., εn are independent and identically distributed random variables with εi ~ N(0, σ²), where σ² is a third unknown parameter. This is the familiar form for a simple linear regression model, where the parameters β0, β1, and σ² explain the relationship between a dependent (or response) variable Y ...
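For reference when working with this model, a minimal sketch of its log-likelihood as a function, assuming nothing beyond the normal-errors setup stated above (the numbers in the example call are made up):

    import numpy as np

    def log_likelihood(beta0, beta1, sigma2, x, y):
        # Log-likelihood of (β0, β1, σ²) under Yi = β0 + β1·Xi + εi, εi ~ N(0, σ²) i.i.d.
        x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
        resid = y - (beta0 + beta1 * x)
        n = y.size
        return -0.5 * n * np.log(2.0 * np.pi * sigma2) - np.sum(resid ** 2) / (2.0 * sigma2)

    # Example call with made-up numbers
    print(log_likelihood(1.0, 0.5, 2.0, x=[1, 2, 3, 4], y=[1.4, 2.1, 2.4, 3.2]))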
2. Consider a simple linear regression model for a response variable Yi and a single predictor variable xi, i = 1, ..., n, with Gaussian (i.e. normally distributed) errors: Yi = β xi + εi, εi ~ N(0, σ²). This model is often called "regression through the origin" since E(Yi) = 0 if xi = 0.
(a) Write down the likelihood function for the parameters β and σ².
(b) Find the MLEs for β and σ², explicitly showing that they are unique maximizers of the likelihood function. Hint: The function g(x) = log(x) + 1 − x ...
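A rough numerical check for part (b), under the usual closed-form candidates β̂ = Σ xi Yi / Σ xi² and σ̂² = (1/n) Σ (Yi − β̂ xi)²: no point on a coarse grid should give a larger likelihood than the candidate maximizer. The data, true values, and grid ranges below are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    beta, sigma = 1.5, 0.8                         # illustrative true values
    x = rng.uniform(1.0, 5.0, size=40)
    y = beta * x + rng.normal(0.0, sigma, size=x.size)

    beta_hat = np.sum(x * y) / np.sum(x ** 2)      # candidate MLE for β
    sigma2_hat = np.mean((y - beta_hat * x) ** 2)  # candidate MLE for σ²

    def loglik(b, s2):
        return -0.5 * x.size * np.log(2.0 * np.pi * s2) - np.sum((y - b * x) ** 2) / (2.0 * s2)

    best = loglik(beta_hat, sigma2_hat)
    grid = [loglik(b, s2) for b in np.linspace(0.5, 2.5, 81) for s2 in np.linspace(0.1, 2.0, 81)]
    print(best >= max(grid))                       # True: no grid point beats the closed forms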
4. We have n statistical units. For unit i, we have (xi, yi), for i = 1, 2, ..., n. We used the least squares line to obtain the estimated regression line ŷ = b0 + b1 x.
(a) Show that the centroid (x̄, ȳ) is a point on the least squares line, where x̄ = (1/n) Σ xi and ȳ = (1/n) Σ yi. (Hint: Evaluate the line at x = x̄.)
(b) In the suggested exercises, we showed that Σ ei = 0 and Σ xi ei = 0, where ...
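All three facts can be checked numerically on any data set; the sketch below uses arbitrary simulated (xi, yi) pairs and verifies that the centroid lies on the fitted line and that the residuals satisfy Σ ei = 0 and Σ xi ei = 0.

    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.normal(size=30)
    y = 1.0 + 2.0 * x + rng.normal(size=30)      # arbitrary illustrative data

    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    e = y - (b0 + b1 * x)                        # residuals

    print(np.isclose(b0 + b1 * x.mean(), y.mean()))                  # centroid (x̄, ȳ) is on the line
    print(np.isclose(e.sum(), 0.0), np.isclose((x * e).sum(), 0.0))  # Σ ei = 0 and Σ xi ei = 0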
3. Consider the linear model: Yi = α + β xi + εi for i = 1, ..., n, where E(εi) = 0. Further assume that Σ xi = 0 and Σ xi² = n.
(a) Show that the least squares estimates (LSEs) of α and β are given by α̂ = Ȳ and β̂ = (1/n) Σ xi Yi.
(b) Show that the LSEs in (a) are unbiased.
(c) Assume that E(εi²) = σ² for all i and E(εi εj) = 0 for all i ≠ j, where σ² > 0. Show that V(α̂) = σ²/n and V(β̂) = σ²/n.
(d) Use (b) and (c) above to show that the LSEs are consistent estimators.
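A simulation sketch for (b) and (c), with illustrative α, β, σ, and n (not from the problem): after enforcing Σ xi = 0 and Σ xi² = n on the design, the Monte Carlo means of α̂ and β̂ should sit near α and β, and their variances near σ²/n.

    import numpy as np

    rng = np.random.default_rng(3)
    alpha, beta, sigma, n = 1.0, 2.0, 1.5, 50    # illustrative values, not from the problem
    x = rng.normal(size=n)
    x = x - x.mean()                             # enforce Σ xi = 0
    x = x * np.sqrt(n / np.sum(x ** 2))          # rescale so that Σ xi² = n

    a_hat, b_hat = [], []
    for _ in range(20000):
        y = alpha + beta * x + rng.normal(0.0, sigma, size=n)
        a_hat.append(y.mean())                   # α̂ = Ȳ
        b_hat.append(np.sum(x * y) / n)          # β̂ = (1/n) Σ xi Yi
    print(np.mean(a_hat), alpha, np.var(a_hat), sigma ** 2 / n)   # mean ≈ α, variance ≈ σ²/n
    print(np.mean(b_hat), beta, np.var(b_hat), sigma ** 2 / n)    # mean ≈ β, variance ≈ σ²/n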
Suppose we have a regression model Yi = b Xi + εi, where Ȳ = X̄ = 0 and there is no intercept in the model. Consider the slope estimator b̂ = Σ Xi Yi / Σ Xi². Show whether this will yield an unbiased estimate of b or not.
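One way to probe the question empirically, reading the estimator as b̂ = Σ Xi Yi / Σ Xi² and using an assumed slope, design, and error distribution (all illustrative): if b̂ is unbiased, its average over many simulated samples should be close to b.

    import numpy as np

    rng = np.random.default_rng(4)
    b, n = 0.7, 40                               # illustrative slope and sample size
    X = rng.normal(size=n)
    X = X - X.mean()                             # centre the design so that X̄ = 0

    b_hats = []
    for _ in range(20000):
        Y = b * X + rng.normal(0.0, 1.0, size=n)
        b_hats.append(np.sum(X * Y) / np.sum(X ** 2))   # b̂ = Σ Xi Yi / Σ Xi²
    print(np.mean(b_hats), b)                    # close agreement suggests unbiasedness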