1. Consider the simple linear regression model yᵢ = β₀ + β₁xᵢ + εᵢ, where β₀ is known. (a) Find the least squares estimator b₁ of β₁. (b) Is this estimator unbiased? Prove your result.
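A quick numerical sketch of part (a), assuming the model yᵢ = β₀ + β₁xᵢ + εᵢ with β₀ known: minimizing Σ(yᵢ − β₀ − b₁xᵢ)² over b₁ gives b₁ = Σxᵢ(yᵢ − β₀)/Σxᵢ². The data, seed, and parameter values below are hypothetical, chosen only to check the closed form against a brute-force minimizer.

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1 = 2.0, 0.5                       # hypothetical true values; beta0 is "known"
x = np.linspace(1, 10, 30)
y = beta0 + beta1 * x + rng.normal(0, 0.3, x.size)

# Closed form: minimizing sum (y_i - beta0 - b1*x_i)^2 over b1 gives
# b1 = sum x_i (y_i - beta0) / sum x_i^2   (no centering, since beta0 is known)
b1 = np.sum(x * (y - beta0)) / np.sum(x**2)

# Brute-force check: scan candidate slopes and keep the SSE minimizer
grid = np.linspace(0, 1, 100001)
sse = ((y - beta0 - grid[:, None] * x)**2).sum(axis=1)
b1_grid = grid[np.argmin(sse)]
print(b1, b1_grid)                            # the two estimates agree
```

Note that, unlike the usual slope estimator, no centering by x̄ appears here: with β₀ fixed there is only one free parameter.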
2. Consider the simple linear regression model yᵢ = β₀ + β₁xᵢ + εᵢ, where ε₁, ..., εₙ are i.i.d. N(0, σ²), for i = 1, 2, ..., n. Suppose that we would like to estimate the mean response at x = x*, that is, we want to estimate μ_{y|x*} = β₀ + β₁x*. The least squares estimator for μ_{y|x*} is μ̂_{y|x*} = b₀ + b₁x*, where b₀, b₁ are the least squares estimators for β₀, β₁. (a) Show that the least squares estimator for...
Consider the simple linear regression model yᵢ = β₀ + β₁xᵢ + εᵢ, where the errors ε₁, ..., εₙ are i.i.d. random variables with E(εᵢ) = 0 and Var(εᵢ) = σ², i = 1, ..., n. Solve either one of the questions below. 1. Let β̂₁ be the least squares estimator for β₁. Show that β̂₁ is the best linear unbiased estimator for β₁. (Note: you can read the proof on Wikipedia, but you cannot use matrix notation in this proof.) 2. Consider a new loss function Lλ, where...
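For question 1, a Monte Carlo sketch can illustrate (not prove) the Gauss–Markov claim: compare the OLS slope against some other linear unbiased estimator of β₁, here the "endpoint" estimator (yₙ − y₁)/(xₙ − x₁), chosen as a hypothetical competitor. All parameter values and the design below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
beta0, beta1, sigma = 1.0, 2.0, 1.0           # hypothetical true values
x = np.linspace(0, 1, 20)                     # hypothetical fixed design
Sxx = np.sum((x - x.mean())**2)

ols, endpoints = [], []
for _ in range(20000):
    y = beta0 + beta1 * x + rng.normal(0, sigma, x.size)
    ols.append(np.sum((x - x.mean()) * (y - y.mean())) / Sxx)
    endpoints.append((y[-1] - y[0]) / (x[-1] - x[0]))  # also linear and unbiased
ols, endpoints = np.array(ols), np.array(endpoints)

print(ols.mean(), endpoints.mean())           # both ~ beta1 (unbiased)
print(ols.var(), sigma**2 / Sxx)              # OLS variance matches sigma^2/Sxx
print(endpoints.var())                        # strictly larger, as Gauss-Markov predicts
```

The simulation shows one competitor losing; the exercise asks for the general argument that every linear unbiased estimator has variance at least σ²/S_xx.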
3. Consider the simple linear regression model yᵢ = β₀ + β₁xᵢ + εᵢ and β̂₁, the parameter estimate of the slope coefficient β₁. Find the expectation and variance of β̂₁. Is the parameter estimate β̂₁ (a) unbiased? (b) linear in y? (c) efficient (optimal in terms of variance)? What will your answers be if you know that there is no intercept coefficient in your model?
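A hedged simulation sketch for this problem (all numbers below are hypothetical): it checks E[β̂₁] = β₁ and Var(β̂₁) = σ²/Σ(xᵢ − x̄)², and then compares with the no-intercept estimator b₁ = Σxᵢyᵢ/Σxᵢ², whose variance σ²/Σxᵢ² is smaller because Σxᵢ² ≥ Σ(xᵢ − x̄)².

```python
import numpy as np

rng = np.random.default_rng(2)
beta1, sigma = 1.5, 0.5                       # hypothetical true slope and error sd
x = np.array([1., 2., 3., 4., 5., 6.])        # hypothetical fixed design
Sxx = np.sum((x - x.mean())**2)

slopes = []
for _ in range(50000):
    y = beta1 * x + rng.normal(0, sigma, x.size)   # true intercept is 0 here
    slopes.append(np.sum((x - x.mean()) * (y - y.mean())) / Sxx)
slopes = np.array(slopes)

print(slopes.mean())                          # ~ beta1: the estimator is unbiased
print(slopes.var(), sigma**2 / Sxx)           # ~ sigma^2 / Sxx

# If the model truly has no intercept, b1 = sum(x*y)/sum(x^2) is available
# and has the smaller variance sigma^2 / sum(x^2):
print(sigma**2 / np.sum(x**2), "<", sigma**2 / Sxx)
```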
1. Consider the simple linear regression model Yᵢ = β₀ + β₁xᵢ + εᵢ, where ε₁, ..., εₙ are i.i.d. N(0, σ²), for i = 1, 2, ..., n. Let b₁ = S_xy/S_xx and b₀ = Ȳ − b₁x̄ be the least squares estimators of β₁ and β₀, respectively. We showed in class that b₁ ~ N(β₁, σ²/S_xx) and Ȳ ~ N(β₀ + β₁x̄, σ²/n). We also showed in class that b₁ and Ȳ are uncorrelated, i.e., σ{Ȳ, b₁} = 0. (a) Show that b₀ is...
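The two facts quoted from class can be checked by simulation. A minimal sketch, with hypothetical parameter values and design, estimating cov(Ȳ, b₁) and Var(b₁) over repeated samples:

```python
import numpy as np

rng = np.random.default_rng(3)
beta0, beta1, sigma = 1.0, 2.0, 1.0          # hypothetical true values
x = np.array([0., 1., 2., 4., 7.])           # hypothetical fixed design
Sxx = np.sum((x - x.mean())**2)

ybars, b1s = [], []
for _ in range(50000):
    y = beta0 + beta1 * x + rng.normal(0, sigma, x.size)
    ybars.append(y.mean())
    b1s.append(np.sum((x - x.mean()) * y) / Sxx)   # b1 = Sxy/Sxx

cov = np.cov(ybars, b1s)[0, 1]
print(cov)                                   # ~ 0: Ybar and b1 are uncorrelated
print(np.var(b1s), sigma**2 / Sxx)           # Var(b1) matches sigma^2/Sxx
```

Uncorrelatedness of Ȳ and b₁ is what makes the variance of b₀ = Ȳ − b₁x̄ split into two clean terms in part (a).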
1. For the general multivariate regression model, the least squares estimator is given by β̂ = (X′X)⁻¹X′y. Show that for the slope estimator in the simple (bivariate) regression case, this is equivalent to β̂₁ = Σᵢ(Xᵢ − X̄)(Yᵢ − Ȳ) / Σᵢ(Xᵢ − X̄)². 2. In the general multivariate regression model, the variance of the least squares estimator is Var(β̂) = σ²(X′X)⁻¹. Show that for the simple regression case, this is equivalent to (a) Var(β̂₀) = σ²[1/n + X̄²/Σᵢ(Xᵢ − X̄)²], (b) Var(β̂₁) = σ²/Σᵢ(Xᵢ − X̄)². (c) What is the covariance between β̂₀ and β̂₁?
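The matrix and scalar forms can be verified numerically before proving them algebraically. A sketch with hypothetical data (the equivalences hold for any design with non-constant x, and σ² cancels when comparing the variance entries):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 25
x = rng.uniform(0, 10, n)                     # hypothetical data
y = 3.0 + 0.7 * x + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), x])          # design matrix [1  x]
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # (X'X)^{-1} X'y

Sxx = np.sum((x - x.mean())**2)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
b0 = y.mean() - b1 * x.mean()
print(beta_hat, (b0, b1))                     # both routes give the same estimates

# sigma^2 (X'X)^{-1} versus the scalar formulas (sigma^2 cancels on both sides)
XtXinv = np.linalg.inv(X.T @ X)
print(XtXinv[0, 0], 1/n + x.mean()**2 / Sxx)  # Var(b0)/sigma^2
print(XtXinv[1, 1], 1 / Sxx)                  # Var(b1)/sigma^2
print(XtXinv[0, 1], -x.mean() / Sxx)          # Cov(b0, b1)/sigma^2: negative when xbar > 0
```

The off-diagonal entry answers part (c): Cov(β̂₀, β̂₁) = −σ²X̄/Σᵢ(Xᵢ − X̄)².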
6. This problem considers the simple linear regression model, that is, a model with a single covariate x that has a linear relationship with a response y. This simple linear regression model is y = β₀ + β₁x + ε, where β₀ and β₁ are unknown constants, and ε is a random error with a normal distribution with mean 0 and unknown variance σ². The covariate x is often controlled by the data analyst and measured with negligible error, while y is a random variable...
Consider the least-squares residuals eᵢ = yᵢ − ŷᵢ, i = 1, 2, ..., n, from the simple linear regression model. Find the variance of the residuals, Var(eᵢ). Is the variance of the residuals a constant? Discuss.
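A numerical sketch of the result this exercise is after, using the hat matrix H = X(X′X)⁻¹X′: since e = (I − H)y, the standard result is Var(eᵢ) = σ²(1 − hᵢᵢ), which depends on i through the leverage hᵢᵢ. The design below is hypothetical, with one high-leverage point to make the non-constancy visible.

```python
import numpy as np

x = np.array([1., 2., 3., 4., 10.])   # hypothetical design; 10 is a high-leverage point
n = x.size
X = np.column_stack([np.ones(n), x])
H = X @ np.linalg.inv(X.T @ X) @ X.T  # hat matrix; e = (I - H) y
h = np.diag(H)                        # leverages h_ii

sigma2 = 1.0
print(sigma2 * (1 - h))               # Var(e_i) = sigma^2 (1 - h_ii): not constant in i

# Monte Carlo confirmation of the per-observation residual variances
rng = np.random.default_rng(5)
Y = 1.0 + 0.5 * x + rng.normal(0, np.sqrt(sigma2), (100000, n))
resid = Y - Y @ H                     # H is symmetric, so Y @ H fits each row
print(resid.var(axis=0))              # matches sigma^2 (1 - h_ii) up to MC error
```

The spread in 1 − hᵢᵢ is the point of the "Discuss" part: even under homoscedastic errors, residuals at high-leverage x values have smaller variance than residuals near x̄... the opposite of what the raw errors have.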
Consider the simple linear regression model Yᵢ = β₀ + β₁xᵢ + εᵢ, i = 1, ..., n, with the least squares estimates β̂′ = (β̂₀  β̂₁). We observe a new value of the predictor: x₀′ = (1  x₀). Show that the expression for the 100(1 − α)% prediction interval reduces to the following:

β̂₀ + β̂₁x₀ ± t_{α/2, n−2} · s · √(1 + 1/n + (x₀ − x̄)² / Σᵢ(xᵢ − x̄)²)
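The algebraic core of this reduction is the identity x₀′(X′X)⁻¹x₀ = 1/n + (x₀ − x̄)²/Σ(xᵢ − x̄)², since the general prediction variance factor is 1 + x₀′(X′X)⁻¹x₀. A sketch with hypothetical data checking the two forms agree:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 12
x = rng.uniform(0, 5, n)              # hypothetical observed predictor values
X = np.column_stack([np.ones(n), x])
x0 = 3.7                              # hypothetical new predictor value
x0_vec = np.array([1.0, x0])

Sxx = np.sum((x - x.mean())**2)

# General matrix form of the prediction-variance factor: 1 + x0'(X'X)^{-1} x0
matrix_form = 1 + x0_vec @ np.linalg.inv(X.T @ X) @ x0_vec
# Scalar form it reduces to: 1 + 1/n + (x0 - xbar)^2 / Sxx
scalar_form = 1 + 1/n + (x0 - x.mean())**2 / Sxx
print(matrix_form, scalar_form)       # identical, which is the reduction to show
```

The written proof amounts to expanding the 2×2 inverse (X′X)⁻¹ and collecting terms, exactly as this numerical check suggests.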
Exercise 5. Consider a linear model with n = 2m in which Yᵢ = β₀ + β₁xᵢ + εᵢ, i = 1, ..., m, and Yᵢ = β₀ + β₂xᵢ + εᵢ, i = m + 1, ..., n. Here ε₁, ..., εₙ are i.i.d. from N(0, σ²), and β = (β₀, β₁, β₂)′ and σ² are unknown parameters; x₁, ..., xₙ are known constants with x₁ + ... + x_m = x_{m+1} + ... + xₙ = 0. 1. Write the model in vector form...
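For part 1, the vector form is Y = Xβ + ε with a three-column design matrix: an intercept column, a column carrying xᵢ in the first m rows, and a column carrying xᵢ in the last m rows. A minimal construction sketch with hypothetical constants satisfying the zero-sum condition:

```python
import numpy as np

m = 3
n = 2 * m
# Hypothetical known constants satisfying x1+...+xm = x_{m+1}+...+xn = 0
x = np.array([-1., 0., 1., -2., 0., 2.])

# Vector form Y = X beta + eps, beta = (beta0, beta1, beta2)':
# rows 1..m carry x_i in the beta1 column, rows m+1..n in the beta2 column
X = np.zeros((n, 3))
X[:, 0] = 1.0          # intercept column
X[:m, 1] = x[:m]       # slope beta1 applies to the first regime
X[m:, 2] = x[m:]       # slope beta2 applies to the second regime
print(X)
```

The zero-sum condition makes all three columns mutually orthogonal, which is what simplifies (X′X) in the later parts of the exercise.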