For the linear regression model y_i = a + b x_i + e_i, calculate the 95% confidence interval for b, assuming e_i ~ N(0, σ²).
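A minimal sketch of how the 95% CI for the slope is computed in practice: b̂ ± t_{0.975, n−2} · √(s²/S_xx). The synthetic data and the true parameter values below are assumptions for the demo, not part of the original problem.

```python
import numpy as np
from scipy import stats

# Illustrative data (assumed for the demo): y_i = a + b*x_i + e_i, e_i ~ N(0, sigma^2)
rng = np.random.default_rng(0)
n = 30
x = np.linspace(0.0, 10.0, n)
y = 2.0 + 1.5 * x + rng.normal(0.0, 1.0, n)   # true a = 2, b = 1.5 (hypothetical)

xbar, ybar = x.mean(), y.mean()
Sxx = np.sum((x - xbar) ** 2)
b_hat = np.sum((x - xbar) * (y - ybar)) / Sxx  # OLS slope estimate
a_hat = ybar - b_hat * xbar                    # OLS intercept estimate

resid = y - (a_hat + b_hat * x)
s2 = np.sum(resid ** 2) / (n - 2)              # unbiased estimate of sigma^2
se_b = np.sqrt(s2 / Sxx)                       # standard error of b_hat

t_crit = stats.t.ppf(0.975, df=n - 2)          # two-sided 95% t critical value
ci = (b_hat - t_crit * se_b, b_hat + t_crit * se_b)
print(ci)
```

The interval is centered at b̂ and uses the t distribution with n − 2 degrees of freedom because σ² is estimated from the residuals.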
5) Consider the simple linear regression model y_i = α + βx_i + ε_i, ε_i ~ N(0, σ²), i = 1, ..., n. Let ȳ be the mean of the y_i, and let α̂ and β̂ be the MLEs of α and β, respectively. Let ŷ_i = α̂ + β̂x_i be the fitted values, and let e_i = y_i − ŷ_i be the residuals. (a) What is Cov(ȳ, β̂)? (b) What is Cov(α̂, β̂)? (c) Show that Σ_{i=1}^n e_i = 0. (d) Show that Σ_{i=1}^n x_i e_i = 0. (e) Show that Σ_{i=1}^n ŷ_i e_i = ...
Suppose we have a regression model Y_i = bX_i + ε_i, where Ȳ = X̄ = 0 and there is no intercept in the model. Consider the slope estimator b̂ = Σ_i X_i Y_i / Σ_i X_i². Show whether this will yield an unbiased estimate of b or not.
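If the garbled estimator in the question above is the usual through-origin slope (an assumption; the extraction mangled the formula), the unbiasedness check is a one-line expectation calculation, treating the X_i as fixed:

```latex
\hat{b} = \frac{\sum_i X_i Y_i}{\sum_i X_i^2}, \qquad
E[\hat{b}]
  = \frac{\sum_i X_i\, E[Y_i]}{\sum_i X_i^2}
  = \frac{\sum_i X_i (b X_i)}{\sum_i X_i^2}
  = b \qquad \text{(using } E[\varepsilon_i] = 0\text{)}.
```

So this reading of the estimator is unbiased. If the intended estimator was instead b̂ = Σ_i X_i² Y_i / Σ_i X_i², the same calculation gives E[b̂] = b · Σ_i X_i³ / Σ_i X_i², which is biased unless Σ_i X_i³ = 0.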
Regression analysis 1.3. Use the statistical model Y_i = β₀ + β₁X_i + ε_i to show that ε_i ~ NID(0, σ²) implies each of the following: (a) E(Y_i) = β₀ + β₁X_i, (b) σ²(Y_i) = σ², and (c) Cov(Y_i, Y_i′) = 0, i ≠ i′. For parts (b) and (c), use the following definitions of variance and covariance: σ²(Y_i) = E{[Y_i − E(Y_i)]²}, Cov(Y_i, Y_i′) = E{[Y_i − E(Y_i)][Y_i′ − E(Y_i′)]}.
Consider the least-squares residuals e_i = y_i − ŷ_i, i = 1, 2, ..., n, from the simple linear regression model. Find the variance of the residuals, Var(e_i). Is the variance of the residuals constant? Discuss.
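For reference, the standard result this exercise is after, written with the leverage h_ii of observation i:

```latex
e_i = y_i - \hat{y}_i, \qquad
\operatorname{Var}(e_i) = \sigma^2 \left( 1 - h_{ii} \right), \qquad
h_{ii} = \frac{1}{n} + \frac{(x_i - \bar{x})^2}{\sum_{j=1}^{n} (x_j - \bar{x})^2}.
```

Since h_ii depends on x_i, the residual variance is not constant: observations far from x̄ have larger leverage and hence smaller residual variance.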
3. Consider the linear model: Y_i = α + βx_i + ε_i for i = 1, ..., n, where E(ε_i) = 0. Further assume that Σ_i x_i = 0 and Σ_i x_i² = n. (a) Show that the least squares estimates (LSEs) of α and β are given by α̂ = Ȳ and β̂ = (1/n) Σ_i x_i Y_i. (b) Show that the LSEs in (a) are unbiased. (c) Assume that E(ε_i²) = σ² and E(ε_i ε_j) = 0 for all i ≠ j, where σ² > 0. Show that V(β̂) = σ²/n. (d) Use (b) and (c) above to show that the LSEs are consistent...
1. Consider the simple linear regression model: Y_i = β₀ + β₁x_i + ε_i, where ε₁, ..., ε_n are i.i.d. N(0, σ²), for i = 1, 2, ..., n. Let b₁ = S_xy/S_xx and b₀ = Ȳ − b₁x̄ be the least squares estimators of β₁ and β₀, respectively. We showed in class that Ȳ ~ N(β₀ + β₁x̄, σ²/n) and b₁ ~ N(β₁, σ²/S_xx). We also showed in class that b₁ and Ȳ are uncorrelated, i.e. σ{Ȳ, b₁} = 0. (a) Show that b₀ is...
2. Consider a simple linear regression model for a response variable Y_i and a single predictor variable x_i, i = 1, ..., n, having Gaussian (i.e. normally distributed) errors: Y_i = βx_i + ε_i, ε_i i.i.d. N(0, σ²). This model is often called "regression through the origin" since E(Y_i) = 0 if x_i = 0. (a) Write down the likelihood function for the parameters β and σ². (b) Find the MLEs for β and σ², explicitly showing that they are unique maximizers of the likelihood function. (Hint: The function...
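A sketch of the likelihood setup for part (a) of the through-origin question above, assuming independent errors as stated; the closed forms for the MLEs then follow by setting the partial derivatives of the log-likelihood to zero:

```latex
L(\beta, \sigma^2)
  = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}}
    \exp\!\left( -\frac{(Y_i - \beta x_i)^2}{2\sigma^2} \right),
\qquad
\hat{\beta} = \frac{\sum_i x_i Y_i}{\sum_i x_i^2},
\qquad
\hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} \left( Y_i - \hat{\beta} x_i \right)^2 .
```

Verifying uniqueness (part (b)) amounts to checking that the log-likelihood is strictly concave in β for fixed σ², and that the profile over σ² has a single stationary point.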
1. Consider a regression model Y_i = x_i β + e_i, i = 1, ..., n. You estimate this model using the OLS estimator. (a) Present and discuss the assumptions for OLS estimation.
Under the assumptions of the linear regression model, with Cov(ε_i, ε_j) = 0 for i ≠ j, prove that Cov(W, Y) = 0.
Exercise 2b please! Exercise 1. Consider the regression model through the origin y_i = β₁x_i + ε_i, where ε_i ~ N(0, σ²). It is assumed that the regression line passes through the origin (0, 0). a. Show that for this model s² = Σ_i e_i² / (n − 1) is an unbiased estimator of σ². d. Show that (n − 1)s²/σ² ~ χ²_{n−1}, where s² is the unbiased estimator of σ² from question (a). Exercise 2. Refer to Exercise 1. a. Show that β̂₁ is BLUE (best linear unbiased estimator). b. Show that β̂₁ has...