1. (20 points) Consider the linear regression model y_t = α + βt + u_t, where u_t...
1. Consider the following AR(1) model: ...
a. Explain why this dynamic model violates the TS.3 ZCM (zero conditional mean) assumption made for the unbiasedness of the FDL model estimators.
b. Show that ...
2. Consider the following random walk model: y_t = β0 + y_{t-1} + u_t, t = 0, 1, ..., T.
a. Show that y_t = 3β0 + y_{t-3} + u_t + u_{t-1} + u_{t-2}.
b. Suppose that y_0 = 0; show that y_t = tβ0 + u_t + u_{t-1} + ... + u_1.
c. Suppose that y_0 = 0 and that the u_t for all t are i.i.d. with mean 0 and variance...
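The back-substitution identities in the random walk question can be checked numerically. The sketch below simulates y_t = β0 + y_{t-1} + u_t with y_0 = 0 and verifies the closed form y_t = tβ0 + u_1 + ... + u_t; the values of β0 and T are illustrative assumptions, not part of the problem.

```python
import numpy as np

# Simulate the random walk y_t = beta0 + y_{t-1} + u_t with y_0 = 0 and
# check that it matches the closed form y_t = t*beta0 + u_1 + ... + u_t
# obtained by repeated substitution. beta0 and T are illustrative choices.
rng = np.random.default_rng(0)
beta0, T = 0.5, 50
u = rng.normal(size=T + 1)          # u_1, ..., u_T stored in u[1:]
u[0] = 0.0                          # no shock at t = 0

y = np.zeros(T + 1)                 # y[0] = y_0 = 0
for t in range(1, T + 1):
    y[t] = beta0 + y[t - 1] + u[t]

# Closed form from back-substitution: y_t = t*beta0 + sum_{s=1}^t u_s
closed = beta0 * np.arange(T + 1) + np.cumsum(u)
print(np.allclose(y, closed))       # True
```

The same loop, stopped after three substitutions, reproduces the three-lag identity in part (a).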
Q3. [10 points] [Serial Correlation] Consider a simple linear regression model with time series data: ... Suppose the error u_t is strictly exogenous; that is, ... Moreover, the error term follows an AR(1) serial correlation model; that is, u_t = ρu_{t-1} + e_t, where the e_t are uncorrelated and have zero mean and constant variance.
a. [2 points] Will the OLS estimator of β1 be unbiased? Why or why not?
b. [3 points] Will the conventional estimator of the variance of the OLS estimator be unbiased? Why or...
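A Monte Carlo sketch of the point behind part (a): with strictly exogenous regressors, OLS remains unbiased even when the errors are serially correlated. All parameter values below (ρ, β0, β1, sample size) are illustrative assumptions.

```python
import numpy as np

# Monte Carlo check: average OLS slope over many replications stays close
# to the true beta1 even though u_t = rho*u_{t-1} + e_t is an AR(1).
rng = np.random.default_rng(1)
beta0, beta1, rho, T, reps = 1.0, 2.0, 0.7, 100, 2000
x = rng.normal(size=T)              # fixed regressor, strictly exogenous

slopes = np.empty(reps)
for r in range(reps):
    e = rng.normal(size=T)
    u = np.empty(T)
    u[0] = e[0]
    for t in range(1, T):
        u[t] = rho * u[t - 1] + e[t]    # AR(1) errors
    y = beta0 + beta1 * x + u
    xd = x - x.mean()
    slopes[r] = (xd @ (y - y.mean())) / (xd @ xd)  # OLS slope

print(abs(slopes.mean() - beta1) < 0.05)   # True: no bias in expectation
```

Serial correlation instead shows up in part (b): the usual variance formula assumes uncorrelated errors, so it is generally biased here even though the slope estimate itself is not.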
Consider a population linear regression model: Yt=β0 + β1Xt + ut Calculate: 1. Variance 2. Covariance of ut and Xt 3. β0 4. β1
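The moment formulas behind items 3 and 4 are β1 = Cov(X, Y)/Var(X) and β0 = E[Y] − β1·E[X] (using Cov(X_t, u_t) = 0). A numerical sketch with simulated data; the true parameter values are assumptions for illustration only.

```python
import numpy as np

# Recover beta1 and beta0 from sample moments, mirroring the population
# formulas beta1 = Cov(X, Y)/Var(X), beta0 = E[Y] - beta1*E[X].
rng = np.random.default_rng(2)
n, beta0, beta1 = 100_000, 3.0, -1.5
x = rng.normal(2.0, 1.0, size=n)
u = rng.normal(0.0, 1.0, size=n)    # error uncorrelated with x
y = beta0 + beta1 * x + u

b1 = np.cov(x, y, bias=True)[0, 1] / np.var(x)
b0 = y.mean() - b1 * x.mean()
print(round(b1, 1), round(b0, 1))   # close to -1.5 and 3.0
```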
1. Consider the simple linear regression model: Y_i = β0 + β1 x_i + ε_i, where ε_1, ..., ε_n are i.i.d. N(0, σ²), for i = 1, 2, ..., n. Let b1 = S_xy/S_xx and b0 = Ȳ − b1 x̄ be the least squares estimators of β1 and β0, respectively. We showed in class that b1 ~ N(β1, σ²/S_xx) and Ȳ ~ N(β0 + β1 x̄, σ²/n), and that b1 and Ȳ are uncorrelated, i.e. σ{Ȳ, b1} = 0. (a) Show that b0 is...
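The estimator formulas b1 = S_xy/S_xx and b0 = Ȳ − b1·x̄ can be checked against numpy's own least-squares fit. The toy data below are arbitrary assumptions for illustration.

```python
import numpy as np

# Compute b1 = S_xy / S_xx and b0 = ybar - b1*xbar by hand and compare
# with numpy.polyfit, which solves the same least-squares problem.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

sxy = np.sum((x - x.mean()) * (y - y.mean()))
sxx = np.sum((x - x.mean()) ** 2)
b1 = sxy / sxx
b0 = y.mean() - b1 * x.mean()

b1_np, b0_np = np.polyfit(x, y, 1)   # returns (slope, intercept)
print(np.allclose([b0, b1], [b0_np, b1_np]))  # True
```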
2. Consider the simple linear regression model: y_i = β0 + β1 x_i + ε_i, where ε_1, ..., ε_n are i.i.d. N(0, σ²), for i = 1, 2, ..., n. Suppose that we would like to estimate the mean response at x = x*, that is, we want to estimate μ_{Y|x=x*} = β0 + β1 x*. The least squares estimator for μ_{Y|x=x*} is μ̂_{Y|x=x*} = b0 + b1 x*, where b0, b1 are the least squares estimators for β0, β1. (a) Show that the least squares estimator for...
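A useful identity for the "show that" steps here is b0 + b1·x* = Ȳ + b1·(x* − x̄), which follows from b0 = Ȳ − b1·x̄ and is the usual starting point for deriving the distribution of the mean-response estimator. A quick numerical check, with toy data and an x* value chosen purely for illustration:

```python
import numpy as np

# Verify the algebraic identity b0 + b1*x* = ybar + b1*(x* - xbar)
# for the estimated mean response at x = x*.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.2, 2.8, 5.1, 6.9, 9.2])
xstar = 2.5

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

muhat = b0 + b1 * xstar
print(np.isclose(muhat, y.mean() + b1 * (xstar - x.mean())))  # True
```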
Q.1 Consider the multiple linear regression model Y = Xβ + ε, where ε ~ MVN(0, σ²V) and V ≠ I_n is a diagonal matrix.
a) Derive the weighted least squares estimator for β, i.e., β̂_WLS.
b) Show β̂_WLS is an unbiased estimator for β.
c) Derive the variances of β̂_WLS and the OLS estimator of β. Is the OLS estimator of β still the BLUE? In one sentence, explain why or why not.
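For part (a), the standard answer is β̂_WLS = (XᵀV⁻¹X)⁻¹XᵀV⁻¹Y, which is equivalent to running OLS on the transformed data V^{-1/2}X, V^{-1/2}Y. The sketch below verifies that equivalence numerically; the design matrix, β, and the diagonal of V are all illustrative assumptions.

```python
import numpy as np

# WLS estimator (X' V^{-1} X)^{-1} X' V^{-1} Y versus OLS on the
# V^{-1/2}-transformed data: the two must agree exactly.
rng = np.random.default_rng(3)
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, -2.0, 0.5])
v = rng.uniform(0.5, 4.0, size=n)        # diagonal of V (unequal variances)
Y = X @ beta + rng.normal(size=n) * np.sqrt(v)

Vinv = np.diag(1.0 / v)
beta_wls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ Y)

# Same estimator via the V^{-1/2} transformation:
w = 1.0 / np.sqrt(v)
beta_tr, *_ = np.linalg.lstsq(X * w[:, None], Y * w, rcond=None)
print(np.allclose(beta_wls, beta_tr))    # True
```

This transformation view is also the quickest route to part (b): the transformed model satisfies the classical assumptions, so unbiasedness of OLS carries over to β̂_WLS.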
1.4. Show that in the linear regression model y_t = β1 + β2 x_t + u_t, t = 1, ..., n, the squared multiple correlation coefficient R², based on the least squares estimates β̂1, β̂2 and the fitted values ŷ_t = β̂1 + β̂2 x_t, is necessarily between zero and one, with R² = 1 if and only if y_t = ŷ_t for all t = 1, ..., n (see (1.12)).
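A numerical companion to the claim, assuming the usual definition R² = 1 − SSR/SST for a model with an intercept: R² stays in [0, 1], and a dataset that lies exactly on a line gives R² = 1. All data below are illustrative.

```python
import numpy as np

# Fit y = b0 + b1*x by least squares and check 0 <= R^2 <= 1, then
# confirm R^2 = 1 when every fitted value matches y_t exactly.
rng = np.random.default_rng(4)
x = rng.normal(size=40)
y = 1.0 + 2.0 * x + rng.normal(size=40)

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
yhat = b0 + b1 * x
r2 = 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
print(0.0 <= r2 <= 1.0)                  # True

# Perfect fit: y lies exactly on a line, so R^2 = 1.
ye = 1.0 + 2.0 * x
b1e = np.sum((x - x.mean()) * (ye - ye.mean())) / np.sum((x - x.mean()) ** 2)
b0e = ye.mean() - b1e * x.mean()
r2e = 1.0 - np.sum((ye - (b0e + b1e * x)) ** 2) / np.sum((ye - ye.mean()) ** 2)
print(np.isclose(r2e, 1.0))              # True
```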
Exercise 2b please!
Exercise 1. Consider the regression model through the origin y_i = β1 x_i + ε_i, where ε_i ~ N(0, σ²). It is assumed that the regression line passes through the origin (0, 0).
a. Show that for this model s² = Σ_i (y_i − β̂1 x_i)²/(n − 1) is an unbiased estimator of σ².
d. Show that (n − 1)s²/σ² ~ χ²_{n−1}, where s² is the unbiased estimator of σ² from question (a).
Exercise 2. Refer to Exercise 1.
a. Show that β̂1 is BLUE (best linear unbiased estimator).
b. Show that β̂1 has...
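A Monte Carlo sketch of the claim in Exercise 1(a): in the through-the-origin model only one parameter is fitted, so dividing the residual sum of squares by n − 1 makes s² unbiased for σ². The parameter values and design below are illustrative assumptions.

```python
import numpy as np

# Through-the-origin fit b1 = sum(x*y)/sum(x^2); check by simulation
# that s^2 = SSR/(n-1) has mean close to the true sigma^2.
rng = np.random.default_rng(5)
beta1, sigma2, n, reps = 2.0, 1.5, 20, 5000
x = rng.uniform(1.0, 3.0, size=n)        # fixed design

s2 = np.empty(reps)
for r in range(reps):
    y = beta1 * x + rng.normal(0.0, np.sqrt(sigma2), size=n)
    b1 = (x @ y) / (x @ x)               # LS slope through the origin
    s2[r] = np.sum((y - b1 * x) ** 2) / (n - 1)

print(abs(s2.mean() - sigma2) < 0.05)    # True: mean close to sigma^2
```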
1. Consider the following simple regression model: y = β0 + β1x1 + u (1) and the following multiple regression model: y = β0 + β1x1 + β2x2 + u (2), where x1 is the variable of primary interest to explain y. Which of the following statements is correct? a. When drawing ceteris paribus conclusions about how x1 affects y, with model (1), we must assume that x2, and all other factors contained in u, are uncorrelated with x1. b....
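A simulation sketch of the issue statement (a) is getting at: if x2 affects y and is correlated with x1, the short regression (1) absorbs part of x2's effect into the slope on x1, while the long regression (2) recovers β1. All numbers below are illustrative assumptions.

```python
import numpy as np

# Omitted-variable bias demo: short regression of y on x1 alone versus
# the long regression on (1, x1, x2).
rng = np.random.default_rng(6)
n, beta1, beta2 = 200_000, 1.0, 3.0
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)       # x2 correlated with x1
y = beta1 * x1 + beta2 * x2 + rng.normal(size=n)

# Short regression slope: converges to beta1 + beta2*0.8 = 3.4, not beta1.
b_short = np.cov(x1, y, bias=True)[0, 1] / np.var(x1)

# Long regression recovers beta1.
X = np.column_stack([np.ones(n), x1, x2])
b_long = np.linalg.lstsq(X, y, rcond=None)[0]

print(round(b_short, 1))                 # approx 3.4 (biased)
print(round(b_long[1], 1))               # approx 1.0 (unbiased)
```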