Consider the following simple regression model, where the et are independent errors with E(et) = 0 and Var(et) = σ2xt2 ...
Consider the simple linear regression model yi = β0 + β1xi + εi, where the errors ε1, ..., εn are i.i.d. random variables with E(εi) = 0, Var(εi) = σ2, i = 1, ..., n. Solve either one of the questions below. 1. Let β̂1 be the least squares estimator for β1. Show that β̂1 is the best linear unbiased estimator for β1. (Note: you can read the proof on Wikipedia, but you cannot use matrix notation in this proof.) 2. Consider a new loss function ... where...
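A quick numerical sketch of part 1 (not the requested algebraic proof): under the stated model with arbitrarily chosen values β0 = 2, β1 = 3, σ = 1, the simulation below compares the least squares slope with another linear unbiased estimator of β1 and shows that, across replications, both are centered at β1 but the least squares estimator has the smaller variance, as Gauss-Markov predicts.

import numpy as np

rng = np.random.default_rng(0)
n, beta0, beta1, sigma = 50, 2.0, 3.0, 1.0   # hypothetical true values
x = rng.uniform(0, 10, n)                    # fixed design, reused in every replication

def ls_slope(y):
    # least squares estimator: sum((xi - xbar) * yi) / sum((xi - xbar)^2)
    return np.sum((x - x.mean()) * y) / np.sum((x - x.mean()) ** 2)

def other_linear_unbiased(y):
    # another linear unbiased estimator: slope through the two half-sample means
    lo, hi = x < np.median(x), x >= np.median(x)
    return (y[hi].mean() - y[lo].mean()) / (x[hi].mean() - x[lo].mean())

ls_est, alt_est = [], []
for _ in range(5000):
    y = beta0 + beta1 * x + rng.normal(0, sigma, n)
    ls_est.append(ls_slope(y))
    alt_est.append(other_linear_unbiased(y))

print("LS  mean/var:", np.mean(ls_est), np.var(ls_est))
print("alt mean/var:", np.mean(alt_est), np.var(alt_est))  # both near beta1, LS variance smaller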
1. A simple regression model is given by Yt = β1 + β2Xt + et for t = 1, ..., n, (1) where the regression errors et, with Var(et) = σ2, follow an AR(1) model et = ρet−1 + vt, t = 1, ..., n, where the vt's are uncorrelated random variables with constant variance, that is, E(vt) = 0, Var(vt) = σv2, Cov(vt, vs) = 0 for t ≠ s. Now, given that Var(et) = Var(et−1) = σ2 and Cov(et−1, vt) = 0: (a) Show that ... (b) Show that E(et et−1) = ρσ2. (c) What problem(s) will...
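A simulation sketch (not the requested derivation) that checks these moments empirically: with arbitrarily chosen ρ = 0.6 and σv = 1, it generates a long AR(1) error series and compares the sample lag-1 moment E(et et−1) with ρσ2, and the sample variance with the stationary value σv2/(1 − ρ2).

import numpy as np

rng = np.random.default_rng(1)
rho, sigma_v, T = 0.6, 1.0, 200_000    # hypothetical AR(1) parameters

# generate e_t = rho * e_{t-1} + v_t with uncorrelated innovations v_t
v = rng.normal(0, sigma_v, T)
e = np.zeros(T)
for t in range(1, T):
    e[t] = rho * e[t - 1] + v[t]

e = e[1000:]                           # drop burn-in so the series is roughly stationary
sigma2 = sigma_v**2 / (1 - rho**2)     # stationary Var(e_t)

print("Var(e_t):       sample", e.var(), " theory", sigma2)
print("E(e_t e_{t-1}): sample", np.mean(e[1:] * e[:-1]), " theory", rho * sigma2)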
Consider the simple linear regression model where β0 is known. (a) Find the least squares estimator b1 of β1. (b) Is this estimator unbiased? Prove your result.
4. (20 pts) Consider the following regression model, i = 1, 2, ..., n, where the εi, i = 1, 2, ..., n, are independent, but εi ~ N(0, σ2/i). (a) Do you think the (ordinary) least squares regression technique is suitable to apply to the data (xi, Yi)? Give a brief reasoning. (b) Construct a transformed model so that you can use the ordinary least squares method. (c) Find the parameter estimates for the transformed model in (b). (d) Find the weighted least...
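A sketch of parts (b)–(d), assuming the mean function is yi = β0 + β1xi + εi (the model line is truncated above): since Var(εi) = σ2/i, multiplying the i-th equation by √i gives errors with constant variance σ2, and running OLS on the transformed data is the same as weighted least squares with weights wi = i. All numerical values below are hypothetical.

import numpy as np

rng = np.random.default_rng(2)
n, beta0, beta1, sigma = 100, 1.0, 2.0, 1.5               # hypothetical true values
i = np.arange(1, n + 1)
x = rng.uniform(0, 5, n)
y = beta0 + beta1 * x + rng.normal(0, sigma / np.sqrt(i))  # Var(eps_i) = sigma^2 / i

# (b) transformed model: sqrt(i)*y_i = beta0*sqrt(i) + beta1*sqrt(i)*x_i + sqrt(i)*eps_i
w = np.sqrt(i)
Z = np.column_stack([w, w * x])    # transformed regressors (no separate intercept column)
u = w * y                          # transformed response

# (c) OLS on the transformed model = (d) weighted least squares with weights i
beta_hat = np.linalg.lstsq(Z, u, rcond=None)[0]
print("WLS estimates (beta0, beta1):", beta_hat)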
(Do this problem without using R.) Consider the simple linear regression model y = β0 + β1x + ε, where the errors are independent and normally distributed, with mean zero and constant variance σ2. Suppose we observe 4 observations x = (1, 1, −1, −1) and y = (5, 3, 4, 0). (a) Fit the simple linear regression model to this data and report the fitted regression line. (b) Carry out a test of hypotheses using α = 0.05 to determine...
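For part (a), a sketch of the hand computation (part (b) is cut off above, so only the fit is shown): x̄ = 0, ȳ = 3, Sxx = Σ(xi − x̄)2 = 4, Sxy = Σ(xi − x̄)(yi − ȳ) = 4, so b1 = Sxy/Sxx = 1 and b0 = ȳ − b1x̄ = 3, giving the fitted line ŷ = 3 + x. The code below only verifies this arithmetic.

import numpy as np

x = np.array([1.0, 1.0, -1.0, -1.0])
y = np.array([5.0, 3.0, 4.0, 0.0])

Sxx = np.sum((x - x.mean()) ** 2)               # = 4
Sxy = np.sum((x - x.mean()) * (y - y.mean()))   # = 4
b1 = Sxy / Sxx                                  # slope estimate = 1
b0 = y.mean() - b1 * x.mean()                   # intercept estimate = 3

print(f"fitted line: y-hat = {b0:.1f} + {b1:.1f} x")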
1. Consider the simple linear regression model where β0 is known. (a) Find the least squares estimator b1 of β1. (b) Is this estimator unbiased? Prove your result. (c) Find an expression for Var(b1 | x1, ..., xn) in terms of x1, ..., xn and σ2.
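A simulation sketch related to (a)–(c), using the hypothetical choices β0 = 1 (treated as known), β1 = 2, σ = 1: minimizing Σ(yi − β0 − β1xi)2 over β1 gives b1 = Σxi(yi − β0)/Σxi2, which, conditionally on x1, ..., xn, is unbiased with Var(b1 | x1, ..., xn) = σ2/Σxi2. The code checks both claims empirically.

import numpy as np

rng = np.random.default_rng(3)
n, beta0, beta1, sigma = 40, 1.0, 2.0, 1.0   # beta0 is known
x = rng.uniform(-2, 2, n)                    # fixed design across replications

est = []
for _ in range(20000):
    y = beta0 + beta1 * x + rng.normal(0, sigma, n)
    b1 = np.sum(x * (y - beta0)) / np.sum(x**2)   # LS estimator when beta0 is known
    est.append(b1)

print("mean of b1:", np.mean(est), "(should be near", beta1, ")")
print("var of b1: ", np.var(est), "(theory: sigma^2 / sum(x^2) =", sigma**2 / np.sum(x**2), ")")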
(a) What is meant by heteroscedasticity? What are the effects of heteroscedasticity on: (i) The OLS estimators? In particular, does heteroscedasticity create bias in the OLS estimators? (ii) The variances and standard errors of the OLS estimators? (iii) The validity of the t-test and the F-test of overall significance of the regression? (b) Given: Yi = β1 + β2Xi + ui, Var(ui) = σ2Xi. Show how this model can be transformed so that the disturbances have constant variance. Explain how...
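For part (b), a sketch under the assumption Xi > 0: dividing the equation by √Xi gives Yi/√Xi = β1(1/√Xi) + β2√Xi + ui/√Xi, and Var(ui/√Xi) = σ2, so OLS on the transformed variables (equivalently, WLS with weights 1/Xi) has homoskedastic disturbances. The numerical values below are hypothetical.

import numpy as np

rng = np.random.default_rng(4)
n, beta1, beta2, sigma = 200, 1.0, 0.5, 0.8     # hypothetical true values
X = rng.uniform(0.5, 5.0, n)                    # X_i > 0 so sqrt(X_i) is defined
u = rng.normal(0, sigma * np.sqrt(X))           # Var(u_i) = sigma^2 * X_i
Y = beta1 + beta2 * X + u

# transformed regression: Y_i/sqrt(X_i) on 1/sqrt(X_i) and sqrt(X_i), no intercept
s = np.sqrt(X)
Z = np.column_stack([1 / s, s])
beta_hat = np.linalg.lstsq(Z, Y / s, rcond=None)[0]
print("transformed-model (WLS) estimates of (beta1, beta2):", beta_hat)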
4. Consider the regression model yi = β1 + β2xi2 + ... + βKxiK + ei, where the errors may be heteroskedastic. Choose the most incorrect statement: (a) The OLS estimators are consistent and unbiased. (b) We should report the OLS estimates with robust standard errors. (c) The Gauss-Markov theorem may not apply. (d) GLS cannot be used because we do not know the error variances in practice. (e) We should take care of heteroskedasticity only if homoskedasticity is rejected. Consider the regression model yt = β1 + ... + βKxtK + et, where et = ρet−1 + ...
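As context for options (b) and (d) of the multiple-choice question above: heteroskedasticity-robust (White) standard errors can be computed without knowing the individual error variances. A sketch with statsmodels on simulated heteroskedastic data follows; all parameter values are hypothetical.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 500
x = rng.uniform(0, 3, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5 * (1 + x))    # error variance grows with x

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                    # usual (homoskedasticity-based) standard errors
robust = sm.OLS(y, X).fit(cov_type="HC1")   # heteroskedasticity-robust standard errors

print("conventional SEs:", ols.bse)
print("robust (HC1) SEs:", robust.bse)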
#2 Can you please go through the steps of how you got this? Consider the simple linear regression model yi = xiβ1 + β0 + εi, where the errors ε1, ..., εn are i.i.d. random variables with E[εi] = 0, Var(εi) = σ2, i = 1, ..., n. Solve either one of the questions below. 1. Let β̂1 be the least squares estimator for β1. Show that β̂1 is the best linear unbiased estimator for β1. (Note: you can read the proof on Wikipedia, but you cannot...
5. Show that Var(Y) = Var(e) in the simple linear regression model. (Yes, this should be that simple.) What did you assume?
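One way to see it (a sketch, assuming β0, β1, and the xi are non-random constants, so the only source of randomness in Yi is the error term): Var(Yi) = Var(β0 + β1xi + ei) = Var(ei), since adding a constant does not change the variance.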