We will prove or disprove this using the least squares estimate of b.
Since E(b̂) ≠ b, it follows that b̂ is not an unbiased estimate of b.
For the linear regression model yᵢ = a + bxᵢ + εᵢ, calculate a 95% confidence interval for b, assuming εᵢ ~ N(0, σ²).
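A numerical sketch of how such an interval is computed: b̂ ± t·se(b̂), where se(b̂) = √(s²/Sxx) and t is the 0.975 quantile of t with n−2 degrees of freedom. The data below are made up for illustration, and the t critical value is hard-coded from a t-table since the Python standard library has no t-distribution.

```python
# Hypothetical data for illustration only.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)

xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))

b_hat = sxy / sxx            # least squares slope
a_hat = ybar - b_hat * xbar  # least squares intercept

# Residual variance estimate with n - 2 degrees of freedom.
sse = sum((yi - a_hat - b_hat * xi) ** 2 for xi, yi in zip(x, y))
s2 = sse / (n - 2)
se_b = (s2 / sxx) ** 0.5     # standard error of b_hat

t_crit = 3.182               # 0.975 quantile of t with 3 df, from a t-table
ci = (b_hat - t_crit * se_b, b_hat + t_crit * se_b)
print(b_hat, ci)
```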
Exercise 2b please! Exercise 1: Consider the regression through the origin yᵢ = β₁xᵢ + εᵢ, where εᵢ ~ N(0, σ²). It is assumed that the regression line passes through the origin (0, 0). a. Show that for this model ... is an unbiased estimator of σ². d. Show that (n−1)s²ₑ/σ² ~ χ²(n−1), where s²ₑ is the unbiased estimator of σ² from question (a). Exercise 2: Refer to Exercise 1. a. Show that β̂₁ is BLUE (best linear unbiased estimator). b. Show that β̂₁ has...
3. Consider the linear model: Yᵢ = α + βxᵢ + εᵢ for i = 1, ..., n, where E(εᵢ) = 0. Further assume that Σxᵢ = 0 and Σxᵢ² = n. (a) Show that the least squares estimates (LSEs) of α and β are given by α̂ = Ȳ and β̂ = (1/n)ΣxᵢYᵢ. (b) Show that the LSEs in (a) are unbiased. (c) Assume that E(εᵢ²) = σ² and E(εᵢεⱼ) = 0 for all i ≠ j, where σ² > 0. Show that V(α̂) = σ²/n and V(β̂) = σ²/n. (d) Use (b) and (c) above to show that the LSEs are consistent...
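The unbiasedness claim in part (b) can be checked by simulation: under Σxᵢ = 0 and Σxᵢ² = n, the averages of α̂ = Ȳ and β̂ = (1/n)ΣxᵢYᵢ over many replications should land on the true α and β. The parameter values, design points, and error distribution below are made-up choices for illustration.

```python
import random

random.seed(0)
alpha, beta = 2.0, 3.0
x = [1.0, -1.0, 1.0, -1.0]  # satisfies sum(x) = 0 and sum(x^2) = n = 4
n = len(x)

reps = 20000
a_sum = b_sum = 0.0
for _ in range(reps):
    y = [alpha + beta * xi + random.gauss(0, 1) for xi in x]
    a_sum += sum(y) / n                                 # alpha_hat = Ybar
    b_sum += sum(xi * yi for xi, yi in zip(x, y)) / n   # beta_hat

# Both averages should be close to the true (alpha, beta) = (2.0, 3.0).
print(a_sum / reps, b_sum / reps)
```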
1. Consider the simple linear regression model: Yᵢ = β₀ + β₁xᵢ + εᵢ, for i = 1, 2, ..., n, where ε₁, ..., εₙ are i.i.d. N(0, σ²). Let b₁ = Sxy/Sxx and b₀ = Ȳ − b₁x̄ be the least squares estimators of β₁ and β₀, respectively. We showed in class that Ȳ ~ N(β₀ + β₁x̄, σ²/n) and b₁ ~ N(β₁, σ²/Sxx). We also showed in class that b₁ and Ȳ are uncorrelated, i.e. σ{Ȳ, b₁} = 0. (a) Show that b₀ is...
Problem 3: Absence of Intercept. Consider the regression model Yᵢ = βXᵢ + εᵢ, where εᵢ and Xᵢ satisfy Assumptions SLR1–SLR5. (i) Let β̃ denote an estimator of β that is constructed as β̃ = Ȳ/X̄, where Ȳ and X̄ are the sample means of Yᵢ and Xᵢ, respectively. Show that β̃ is conditionally unbiased. (ii) Derive the least squares estimator of β. (iii) Show that the least squares estimator is conditionally unbiased. (iv) Derive the conditional variance of the least squares estimator.
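A simulation sketch contrasting the two estimators in this problem: the ratio estimator β̃ = Ȳ/X̄ and the least squares estimator β̂ = ΣXᵢYᵢ/ΣXᵢ². Holding X fixed (i.e., conditioning on it), both come out approximately unbiased, and the LSE shows the smaller variance, consistent with part (iv). The value of β, the design points, and the error scale are made-up choices for illustration.

```python
import random

random.seed(1)
beta = 2.0
X = [1.0, 2.0, 3.0, 4.0, 5.0]
n = len(X)
xbar = sum(X) / n
sxx0 = sum(xi * xi for xi in X)   # sum of squares about the origin

reps = 20000
tilde, hat = [], []
for _ in range(reps):
    Y = [beta * xi + random.gauss(0, 1) for xi in X]
    tilde.append((sum(Y) / n) / xbar)                        # Ybar / Xbar
    hat.append(sum(xi * yi for xi, yi in zip(X, Y)) / sxx0)  # LSE

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((vi - m) ** 2 for vi in v) / len(v)

print(mean(tilde), mean(hat))  # both approximately beta = 2.0
print(var(tilde), var(hat))    # the LSE has the smaller variance
```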
1. Consider a regression model Yᵢ = xᵢβ + εᵢ, i = 1, ..., n. You estimate this model using the OLS estimator. (a) Present and discuss the assumptions for OLS estimation.
2. Consider the simple linear regression model: Yᵢ = β₀ + β₁xᵢ + εᵢ, for i = 1, 2, ..., n, where ε₁, ..., εₙ are i.i.d. N(0, σ²). Suppose that we would like to estimate the mean response at x = x*, that is, we want to estimate μ_{Y|x=x*} = β₀ + β₁x*. The least squares estimator for μ_{Y|x=x*} is μ̂ = b₀ + b₁x*, where b₀, b₁ are the least squares estimators for β₀, β₁. (a) Show that the least squares estimator for...
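A numerical sketch of the point estimate μ̂ = b₀ + b₁x* and its standard error √(s²(1/n + (x* − x̄)²/Sxx)), the quantity the subsequent parts of such a problem typically derive. The data and the choice of x* are made up for illustration.

```python
# Hypothetical data for illustration only.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
n = len(x)

xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
b0 = ybar - b1 * xbar

# Residual variance estimate with n - 2 degrees of freedom.
s2 = sum((yi - b0 - b1 * xi) ** 2 for xi, yi in zip(x, y)) / (n - 2)

x_star = 2.5
mu_hat = b0 + b1 * x_star  # estimated mean response at x = x*
se_mu = (s2 * (1 / n + (x_star - xbar) ** 2 / sxx)) ** 0.5
print(mu_hat, se_mu)
```

Note how the (x* − x̄)² term makes the standard error smallest when x* is at the center of the design and grow as x* moves away from x̄.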
1. Suppose the data are generated by the model yᵢ = βxᵢ + εᵢ. Suppose further that E(ε|x) = 0, var(ε|x) = σ², (xᵢ, yᵢ) is i.i.d. with finite fourth moments, and x and ε are jointly normal. But you mistakenly estimate it using the following model: yᵢ = α₁ + α₂xᵢ + eᵢ, and obtain the estimated coefficient parameters. Without looking at the analysis report, determine whether the following statements are true or false. Please briefly explain. (a) α̂₁ = 0 (b)...
2. (10 pt) Consider a linear regression model without the intercept: Yᵢ = β₁Xᵢ + εᵢ, where E(εᵢ) = 0 and V(εᵢ) = σ². What is the LSE of β₁ here?
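A sketch of the derivation: minimize the residual sum of squares in β₁ and set the derivative to zero.

```latex
Q(\beta_1) = \sum_{i=1}^n (Y_i - \beta_1 X_i)^2,
\qquad
\frac{dQ}{d\beta_1} = -2\sum_{i=1}^n X_i (Y_i - \beta_1 X_i) = 0
\;\Longrightarrow\;
\hat\beta_1 = \frac{\sum_{i=1}^n X_i Y_i}{\sum_{i=1}^n X_i^2}.
```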
3. Consider the simple linear regression model yᵢ = β₀ + β₁xᵢ + εᵢ, and let β̂₁ be the parameter estimate of the slope coefficient β₁. Find the expectation and variance of β̂₁. Is the parameter estimate β̂₁ (a) unbiased? (b) linear in y? (c) efficient (optimal in terms of variance)? What will your answers be if you know that there is no intercept coefficient in your model?