Suppose we have a regression model with n samples (xᵢ, yᵢ) of i.i.d. data, where xᵢ > 0, E[uᵢ] = 0, Var[uᵢ] = σ², and (a) ...
2. Consider the simple linear regression model Yᵢ = β₀ + β₁xᵢ + εᵢ, where ε₁, …, εₙ are i.i.d. N(0, σ²), for i = 1, 2, …, n. Suppose that we would like to estimate the mean response at x = x*, that is, we want to estimate μ_{Y|X=x*} = β₀ + β₁x*. The least squares estimator for μ_{Y|X=x*} is μ̂ = b₀ + b₁x*, where b₀, b₁ are the least squares estimators for β₀, β₁. (a) Show that the least squares estimator for ...
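As a numerical companion to the question above, here is a minimal sketch of computing μ̂ = b₀ + b₁x* from the usual least squares formulas. The data and the value x* = 2.5 are illustrative assumptions, not from the exercise.

```python
import numpy as np

# Hypothetical data (not from the exercise): fit SLR by least squares and
# estimate the mean response at x* via mu_hat = b0 + b1 * x_star.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

x_bar, y_bar = x.mean(), y.mean()
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)  # slope
b0 = y_bar - b1 * x_bar                                            # intercept

x_star = 2.5
mu_hat = b0 + b1 * x_star
print(round(b1, 3), round(mu_hat, 3))
```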
Suppose we have a regression model Yᵢ = bXᵢ + εᵢ where Ȳ = X̄ = 0 and there is no intercept in the model. Consider the slope estimator b̂ = Σ XᵢYᵢ / Σ Xᵢ². Show whether this will yield an unbiased estimate of b or not.
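A Monte Carlo check (a sketch, not a proof) of the claim in question: with fixed Xᵢ and E[εᵢ] = 0, the estimator b̂ = Σ XᵢYᵢ / Σ Xᵢ² should average out to the true b. The true b and the X values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
b_true = 1.5
X = np.array([-2.0, -1.0, 1.0, 2.0])   # note X_bar = 0, as in the exercise

estimates = []
for _ in range(20000):
    eps = rng.normal(0.0, 1.0, size=X.size)
    Y = b_true * X + eps
    estimates.append(np.sum(X * Y) / np.sum(X ** 2))

# The Monte Carlo average should sit close to b_true = 1.5.
print(round(float(np.mean(estimates)), 2))
```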
2. In the regression model Y = Xβ + ε, X is a fixed n × k matrix of rank k ≤ n, E(ε) = 0 and E(εε′) = σ²Ω, where Ω is a known non-singular matrix. The GLS estimator of β is given by the formula β̂_GLS = (X′Ω⁻¹X)⁻¹X′Ω⁻¹Y. Consider the following data: 16 31 2 3 51 4 10. Assuming that Ω = ...,
(a) Calculate the GLS estimate of β in the model Y = Xβ + ε.
(b) Calculate the OLS estimate.
(c) Compare the two estimates and comment on efficiency.
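The GLS formula can be evaluated mechanically. Since the exercise's data and Ω are garbled in this copy, the matrices below are purely hypothetical stand-ins; the point is the shape of the computation, β̂_GLS = (X′Ω⁻¹X)⁻¹X′Ω⁻¹Y versus β̂_OLS = (X′X)⁻¹X′Y.

```python
import numpy as np

# Hypothetical design, response, and covariance structure (assumptions).
X = np.array([[1.0, 2.0], [1.0, 4.0], [1.0, 10.0]])
Y = np.array([16.0, 31.0, 51.0])
Omega = np.diag([1.0, 2.0, 4.0])      # assumed known and non-singular

Oi = np.linalg.inv(Omega)
beta_gls = np.linalg.solve(X.T @ Oi @ X, X.T @ Oi @ Y)  # (X'O^-1 X)^-1 X'O^-1 Y
beta_ols = np.linalg.solve(X.T @ X, X.T @ Y)            # (X'X)^-1 X'Y
print(beta_gls, beta_ols)
```

Under E(εε′) = σ²Ω with Ω ≠ I, the GLS estimator is BLUE, while OLS remains unbiased but is less efficient.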
Question 6. Consider the regression model yᵢ = β₀ + β₁xᵢ + εᵢ, where the εᵢ are i.i.d. N(0, σ²) random variables, for i = 1, 2, …, n. (a) (4 points) Show that β̂₁ is normally distributed with mean β₁ and variance σ²/S_XX.
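A simulation sketch of the stated sampling distribution: across repeated samples, β̂₁ should have mean β₁ and variance σ²/S_XX. The parameter values and x grid below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
beta0, beta1, sigma = 1.0, 2.0, 1.0
Sxx = np.sum((x - x.mean()) ** 2)      # S_XX = 10 for this grid

b1s = []
for _ in range(50000):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
    b1s.append(np.sum((x - x.mean()) * y) / Sxx)   # least squares slope

# Mean should be near beta1 = 2, variance near sigma^2 / Sxx = 0.1.
print(round(float(np.mean(b1s)), 2), round(float(np.var(b1s)), 2))
```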
2. Suppose we observe the pairs (Xᵢ, Yᵢ), i = 1, …, n, and fit the simple linear regression (SLR) model. Consider the test H₀: β₁ = 0 vs. Hₐ: β₁ ≠ 0.
(a) What is the full model? Write the appropriate matrices Y and X.
(b) What is the full model SSE?
(c) What is the reduced model? Write the appropriate matrix X_R.
(d) What is the reduced model SSE?
(e) Simplify the F statistic of the ANOVA test of H₀: β₁ = 0 vs. ...
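For part (e), a known simplification is that in SLR the ANOVA F statistic for H₀: β₁ = 0 equals t², where t = b₁/se(b₁). The sketch below checks this numerically on illustrative data (the exercise itself provides none).

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.2, 2.1, 2.8, 4.2, 4.9, 6.3])
n = x.size

X = np.column_stack([np.ones(n), x])           # full model design matrix
b = np.linalg.solve(X.T @ X, X.T @ y)
sse_full = np.sum((y - X @ b) ** 2)            # full-model SSE, df = n - 2
sse_red = np.sum((y - y.mean()) ** 2)          # reduced model: intercept only

F = (sse_red - sse_full) / 1 / (sse_full / (n - 2))   # ANOVA F, 1 numerator df

Sxx = np.sum((x - x.mean()) ** 2)
se_b1 = np.sqrt(sse_full / (n - 2) / Sxx)
t = b[1] / se_b1
print(bool(np.isclose(F, t ** 2)))             # F equals t^2 in SLR
```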
2. Suppose we have the simple regression model Yᵢ = a + bXᵢ + εᵢ, and its OLS coefficient estimators a and b. Answer the following questions.
(a) Suppose we multiply Xᵢ by 1/2 for all i and do the OLS estimation again using Xᵢ/2 as the regressor (the independent variable). What will be your new estimators, denoted by ã (intercept) and b̃ (slope)? Compare them with the original OLS estimators a and b, respectively.
(b) Compare Var[b̃] and Var[b]. Are they the same or ...
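A quick numerical check of part (a): rescaling the regressor by 1/2 doubles the OLS slope and leaves the intercept unchanged. The data below is an illustrative assumption.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 2.5, 4.0, 4.5])

def ols(xv, yv):
    """Return (intercept, slope) from the usual SLR least squares formulas."""
    b = np.sum((xv - xv.mean()) * (yv - yv.mean())) / np.sum((xv - xv.mean()) ** 2)
    a = yv.mean() - b * xv.mean()
    return a, b

a, b = ols(x, y)            # original fit
a_t, b_t = ols(x / 2.0, y)  # fit with the rescaled regressor X/2
print(bool(np.isclose(a_t, a)), bool(np.isclose(b_t, 2 * b)))
```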
1. Given the multiple linear regression model Y = β₀ + β₁X₁ + β₂X₂ + β₃X₃ + … + β_pX_p + ε, which in matrix notation is written as Y = Xβ + ε, where ε has a N(0, σ²I) distribution:
A. Show that the OLS estimator of the parameter vector β is given by β̂ = (X′X)⁻¹X′Y.
B. Show that the OLS estimator in A above is an unbiased estimator of β. Hint: E(β̂) = β.
C. Show that the variance of the estimator is Var(β̂) = σ²(X′X)⁻¹.
D. What is the distribution of the ...
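The closed form in part A can be sanity-checked numerically: solving the normal equations X′Xβ̂ = X′Y should match numpy's least squares solver. The simulated data below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 30, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])  # design with intercept
beta = np.array([1.0, 2.0, -1.0, 0.5])
Y = X @ beta + rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)      # (X'X)^-1 X'Y via normal equations
beta_lstsq, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(bool(np.allclose(beta_hat, beta_lstsq)))
```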
Taking the highlighted parts below as a model to solve the question above. Thank you! Prove that the OLS estimator β̂_LS for β in the linear regression model is consistent.

Let's first show that the OLS estimator is consistent. Recall the result β̂_LS = (Σᵢ₌₁ⁿ XᵢXᵢ′)⁻¹ Σᵢ₌₁ⁿ XᵢYᵢ. Using Yᵢ = Xᵢ′β* + uᵢ, by the WLLN, assuming that E(XᵢXᵢ′) is positive definite (so that its inverse exists) and using Slutsky's theorem, it follows that β̂_OLS converges in probability to ...
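The WLLN/Slutsky argument above can be illustrated by simulation: β̂_OLS should land closer and closer to β* as n grows. The model and the value β* = 2 below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
beta_star = 2.0

def beta_hat(n):
    """OLS slope in a no-intercept model with exogenous errors."""
    x = rng.normal(size=n)
    u = rng.normal(size=n)
    y = x * beta_star + u
    return np.sum(x * y) / np.sum(x ** 2)

for n in [100, 10_000, 1_000_000]:
    print(n, round(beta_hat(n), 3))   # estimates tighten around beta_star
```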
Question 1. Let Y = 1 + 2X + u, where X = Z₁ + Z₂, u = Z₁ − Z₂, and Z₁ and Z₂ are independent standard normals. We have i.i.d. observations (Xᵢ, Yᵢ) from this model.
(a) Suppose we run the following regression to obtain the OLS estimator β̂_OLS. What would you expect the value of β̂_OLS to be when n → ∞? Please give a numerical answer.
(b) Suppose we run the following regression to obtain the OLS estimator γ̂_OLS. What would you expect the ...
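Under the reading X = Z₁ + Z₂ and u = Z₁ − Z₂ (assumed here; the original is garbled), Cov(X, u) = Var(Z₁) − Var(Z₂) = 0, so a regression of Y on X should recover the slope 2 in large samples. A simulation sketch:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 1_000_000
Z1, Z2 = rng.normal(size=n), rng.normal(size=n)
X = Z1 + Z2
u = Z1 - Z2          # uncorrelated with X by construction
Y = 1.0 + 2.0 * X + u

Xd = np.column_stack([np.ones(n), X])
b = np.linalg.solve(Xd.T @ Xd, Xd.T @ Y)   # OLS intercept and slope
print(np.round(b, 2))                       # approximately [1. 2.]
```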
Question 2 (10 points). You are given the following model: yᵢ = βxᵢ + εᵢ. Consider two alternative estimators of β: b₁ = Σxᵢyᵢ / Σxᵢ² and b₂ = Σyᵢ / Σxᵢ.
1. Which estimator would you choose, and why, if the model satisfies all the assumptions of classical regression? Prove your results. (4 points)
2. Now suppose that var(yᵢ) = hxᵢ, where h is a positive constant.
(a) Obtain the correct variance of the OLS estimator. (2 points)
(b) Show that the BLU estimator is now b₂. Derive its variance. ...
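For part 1, under the classical assumptions Var(b₁) = σ²/Σxᵢ² and Var(b₂) = nσ²/(Σxᵢ)², and the Cauchy–Schwarz inequality (Σxᵢ)² ≤ n Σxᵢ² implies Var(b₁) ≤ Var(b₂), so b₁ (the OLS estimator) is preferred. A numerical check on illustrative x values:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 5.0])   # illustrative regressor values
sigma2 = 1.0                          # assumed common error variance
n = x.size

var_b1 = sigma2 / np.sum(x ** 2)          # Var of b1 = sum(xy)/sum(x^2)
var_b2 = n * sigma2 / np.sum(x) ** 2      # Var of b2 = sum(y)/sum(x)
print(bool(var_b1 <= var_b2))             # True, by Cauchy-Schwarz
```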