1. Consider the simple linear regression model $y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$, where $\beta_0$ is known. (a) Find the least squares estimator $b_1$ of $\beta_1$. (b) Is this estimator unbiased? Prove your result. (c) Find an expression for $\operatorname{Var}(b_1 \mid x_1,\dots,x_n)$ in terms of $x_1,\dots,x_n$ and $\sigma^2$.
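A minimal sketch of (a)-(c), using the fact that only $b_1$ needs to be estimated since $\beta_0$ is known:

```latex
% Minimize Q(b_1) = \sum_i (y_i - \beta_0 - b_1 x_i)^2 over b_1 alone.
\[
\frac{dQ}{db_1} = -2\sum_{i=1}^{n} x_i\,(y_i - \beta_0 - b_1 x_i) = 0
\;\Longrightarrow\;
b_1 = \frac{\sum_{i=1}^{n} x_i (y_i - \beta_0)}{\sum_{i=1}^{n} x_i^2}.
\]
% Substituting y_i = \beta_0 + \beta_1 x_i + \varepsilon_i gives
% b_1 = \beta_1 + \sum_i x_i \varepsilon_i / \sum_i x_i^2, hence E[b_1] = \beta_1, and
\[
\operatorname{Var}(b_1 \mid x_1,\dots,x_n) = \frac{\sigma^2}{\sum_{i=1}^{n} x_i^2}.
\]
```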
2. Consider the simple linear regression model $y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$, where $\varepsilon_1,\dots,\varepsilon_n$ are i.i.d. $N(0,\sigma^2)$, for $i = 1,2,\dots,n$. Suppose that we would like to estimate the mean response at $x = x^*$, that is, we want to estimate $\mu_{y|x^*} = \beta_0 + \beta_1 x^*$. The least squares estimator for $\mu_{y|x^*}$ is $\hat\mu_{y|x^*} = b_0 + b_1 x^*$, where $b_0, b_1$ are the least squares estimators for $\beta_0, \beta_1$. (a) Show that the least squares estimator for...
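Part (a) is truncated; it presumably concerns the sampling distribution of $\hat\mu_{y|x^*}$. The standard result, as a hedged sketch:

```latex
% Writing b_0 = \bar Y - b_1 \bar x gives \hat\mu_{y|x^*} = \bar Y + b_1 (x^* - \bar x),
% and since \bar Y and b_1 are uncorrelated (independent under normality),
\[
E[\hat\mu_{y|x^*}] = \beta_0 + \beta_1 x^*,
\qquad
\operatorname{Var}(\hat\mu_{y|x^*})
= \sigma^2\left(\frac{1}{n} + \frac{(x^* - \bar x)^2}{\sum_{i=1}^{n}(x_i - \bar x)^2}\right).
\]
```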
3. Consider the simple linear regression model $y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$ and the parameter estimate $\hat\beta_1$ of the slope coefficient $\beta_1$. Find the expectation and variance of $\hat\beta_1$. Is the parameter estimate $\hat\beta_1$ (a) unbiased? (b) linear in $y$? (c) efficient (optimal in terms of variance)? What will your answers be if you know that there is no intercept coefficient in your model?
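A sketch of the standard answers, writing $S_{xx} = \sum_i (x_i - \bar x)^2$:

```latex
% The slope estimator is linear in y, and substituting the model gives its moments:
\[
\hat\beta_1 = \frac{\sum_i (x_i - \bar x)\, y_i}{S_{xx}}
= \beta_1 + \frac{\sum_i (x_i - \bar x)\,\varepsilon_i}{S_{xx}},
\qquad
E[\hat\beta_1] = \beta_1,
\quad
\operatorname{Var}(\hat\beta_1) = \frac{\sigma^2}{S_{xx}}.
\]
% By the Gauss-Markov theorem it is efficient among linear unbiased estimators.
% Without an intercept, the analogue is \hat\beta_1 = \sum_i x_i y_i / \sum_i x_i^2,
% still unbiased and linear, with variance \sigma^2 / \sum_i x_i^2.
```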
1. Consider the simple linear regression model $Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$, where $\varepsilon_1,\dots,\varepsilon_n$ are i.i.d. $N(0,\sigma^2)$, for $i = 1,2,\dots,n$. Let $b_1 = S_{xy}/S_{xx}$ and $b_0 = \bar Y - b_1 \bar x$ be the least squares estimators of $\beta_1$ and $\beta_0$, respectively. We showed in class that $b_1 \sim N(\beta_1, \sigma^2/S_{xx})$ and $\bar Y \sim N(\beta_0 + \beta_1 \bar x, \sigma^2/n)$. We also showed in class that $b_1$ and $\bar Y$ are uncorrelated, i.e. $\sigma\{\bar Y, b_1\} = 0$. (a) Show that $b_0$ is...
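Part (a) is cut off; it presumably asks for the distribution of $b_0$. A sketch using the facts quoted above:

```latex
% b_0 = \bar Y - b_1 \bar x is a linear combination of the jointly normal,
% uncorrelated (hence independent) variables \bar Y and b_1, so b_0 is normal with
\[
E[b_0] = (\beta_0 + \beta_1 \bar x) - \beta_1 \bar x = \beta_0,
\qquad
\operatorname{Var}(b_0) = \frac{\sigma^2}{n} + \bar x^2\,\frac{\sigma^2}{S_{xx}}
= \sigma^2\left(\frac{1}{n} + \frac{\bar x^2}{S_{xx}}\right).
\]
```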
Consider the following simple regression model: $y_t = \beta_1 + \beta_2 x_t + e_t$, where the $e_t$ are independent errors with $E(e_t) = 0$ and $\operatorname{var}(e_t) = \sigma^2 x_t^2$. a. In this case, would an ordinary least squares regression provide you with the best linear unbiased estimates? Why or why not? b. What is the transformed model that would give you constant error variance? c. Given the following data: $y = (4,3,1,0,2)$ and $x = (1,2,1,3,4)$, find the generalized least squares estimates of $\beta_1$ and $\beta_2$. (Do this by hand, not with Excel!)
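For part (c), a minimal numerical check (not a substitute for the hand calculation), assuming the error variance $\sigma^2 x_t^2$ so that GLS reduces to weighted least squares with weights $1/x_t^2$:

```python
import numpy as np

# GLS check for var(e_t) = sigma^2 * x_t^2: weight each observation by 1/x_t^2.
x = np.array([1.0, 2.0, 1.0, 3.0, 4.0])
y = np.array([4.0, 3.0, 1.0, 0.0, 2.0])
w = 1.0 / x**2

X = np.column_stack([np.ones_like(x), x])   # columns: intercept, slope regressor
W = np.diag(w)

# GLS estimator: beta_hat = (X' W X)^{-1} X' W y
beta_hat = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(beta_hat)   # [beta1_hat (intercept), beta2_hat (slope)]
```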
Consider the simple linear regression model $y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$, where the errors $\varepsilon_1,\dots,\varepsilon_n$ are i.i.d. random variables with $E[\varepsilon_i] = 0$, $\operatorname{var}(\varepsilon_i) = \sigma^2$, $i = 1,\dots,n$. Solve either one of the questions below. 1. Let $\hat\beta_1$ be the least squares estimator for $\beta_1$. Show that $\hat\beta_1$ is the best linear unbiased estimator for $\beta_1$. (Note: you can read the proof on Wikipedia, but you cannot use matrix notation in this proof.) 2. Consider a new loss function $L_\lambda(\cdot,\cdot) = \dots$, where...
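For question 1, a sketch of the scalar Gauss-Markov argument (no matrix notation), with $S_{xx} = \sum_i (x_i - \bar x)^2$:

```latex
% Any linear estimator has the form \tilde\beta_1 = \sum_i c_i y_i. Unbiasedness for
% all \beta_0, \beta_1 forces \sum_i c_i = 0 and \sum_i c_i x_i = 1, and then
% \operatorname{Var}(\tilde\beta_1) = \sigma^2 \sum_i c_i^2. Write c_i = k_i + d_i,
% where k_i = (x_i - \bar x)/S_{xx} are the OLS weights; the two constraints imply
% \sum_i d_i = 0 and \sum_i d_i x_i = 0, hence \sum_i k_i d_i = 0, so
\[
\sum_i c_i^2 = \sum_i k_i^2 + \sum_i d_i^2 \;\ge\; \sum_i k_i^2,
\]
% with equality iff every d_i = 0, i.e. iff \tilde\beta_1 = \hat\beta_1.
```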
6. This problem considers the simple linear regression model, that is, a model with a single covariate $x$ that has a linear relationship with a response $y$. This simple linear regression model is $y = \beta_0 + \beta_1 x + \varepsilon$, where $\beta_0$ and $\beta_1$ are unknown constants, and the random error $\varepsilon$ has a normal distribution with mean 0 and unknown variance $\sigma^2$. The covariate $x$ is often controlled by the data analyst and measured with negligible error, while $y$ is a random variable....
1. For the general multivariate regression model, the least squares estimator is given by $\hat\beta = (X'X)^{-1}X'y$. Show that for the slope estimator in the simple (bivariate) regression case, this is equivalent to $\hat\beta_1 = \sum_i (X_i - \bar X)(Y_i - \bar Y) \big/ \sum_i (X_i - \bar X)^2$. 2. In the general multivariate regression model, the variance of the least squares estimator is $\operatorname{Var}(\hat\beta) = \sigma^2 (X'X)^{-1}$. Show that for the simple regression case, this is equivalent to a. $\operatorname{Var}(\hat\beta_1) = \sigma^2 \big/ \sum_i (X_i - \bar X)^2$ b. $\operatorname{Var}(\hat\beta_0) = \sigma^2 \sum_i X_i^2 \big/ \big(n \sum_i (X_i - \bar X)^2\big)$ c. What is the covariance between $\hat\beta_0$ and $\hat\beta_1$?
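A quick numerical illustration of the equivalence in question 1, on synthetic data (the seed and sample size are arbitrary choices for illustration):

```python
import numpy as np

# Compare the matrix OLS estimator (X'X)^{-1} X'y with the textbook slope formula.
rng = np.random.default_rng(0)
x = rng.normal(size=20)
y = 1.5 + 2.0 * x + rng.normal(size=20)   # synthetic data, for illustration only

X = np.column_stack([np.ones_like(x), x])        # design matrix with intercept
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)     # [beta0_hat, beta1_hat]
slope = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean())**2).sum()

print(np.allclose(beta_hat[1], slope))   # True: the two slope formulas agree
```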
Consider a simple linear regression model with nonstochastic regressor: $Y_i = \beta_1 + \beta_2 X_i + u_i$. 1. [3 points] What are the assumptions of this model so that the OLS estimators are BLUE (best linear unbiased estimates)? 2. [4 points] Let $\hat\beta_1$ and $\hat\beta_2$ be the OLS estimators of $\beta_1$ and $\beta_2$. Derive $\hat\beta_1$ and $\hat\beta_2$. 3. [2 points] Show that $\hat\beta_2$ is an unbiased estimator of $\beta_2$.
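For part 3, a sketch of the unbiasedness argument under nonstochastic $X_i$ and $E[u_i] = 0$:

```latex
% Using \sum_i (X_i - \bar X) = 0, the OLS slope can be rewritten as
\[
\hat\beta_2 = \frac{\sum_i (X_i - \bar X)\, Y_i}{\sum_i (X_i - \bar X)^2}
= \beta_2 + \frac{\sum_i (X_i - \bar X)\, u_i}{\sum_i (X_i - \bar X)^2},
\]
% and taking expectations (the X_i are constants) gives E[\hat\beta_2] = \beta_2.
```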
Find the estimator $\hat\beta$ in multivariate linear regression.
Multivariate Linear Regression: Parameter Estimation by Ordinary Least Squares. The ordinary least squares (OLS) problem is $\min_{B \in \mathbb{R}^{(p+1)\times m}} \sum_{i=1}^{n} \sum_{k=1}^{m} (y_{ik} - x_i' b_k)^2 = \min_{B \in \mathbb{R}^{(p+1)\times m}} \|Y - XB\|^2$, where $\|\cdot\|$ denotes the Frobenius norm. The OLS solution has the form $\hat b_k = (X'X)^{-1} X' y_k$ for $k = 1,\dots,m$, i.e. $\hat B = (X'X)^{-1} X' Y$, where $b_k$ and $y_k$ denote the $k$-th columns of $B$ and $Y$, respectively.
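A minimal sketch of this solution in code, with synthetic dimensions chosen only for illustration:

```python
import numpy as np

# Multivariate OLS: B_hat = (X'X)^{-1} X'Y solves all m column problems at once.
rng = np.random.default_rng(1)
n, p, m = 50, 3, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])   # n x (p+1), intercept included
B_true = rng.normal(size=(p + 1, m))
Y = X @ B_true + 0.1 * rng.normal(size=(n, m))               # synthetic responses

B_hat = np.linalg.solve(X.T @ X, X.T @ Y)       # (p+1) x m, all columns at once
b_0 = np.linalg.solve(X.T @ X, X.T @ Y[:, 0])   # OLS on the first response column alone

print(np.allclose(B_hat[:, 0], b_0))   # True: column k of B_hat equals (X'X)^{-1} X' y_k
```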