Exercise 5. Consider a linear model with n = 2m in which Y_i = β0 + β1 x_i + ε_i, i = 1, ..., n. Here ε1, ..., εn are i.i.d. from N(0, σ²); β = (β0, β1)' and σ² are unknown parameters; and x1, ..., xn are known constants with x1 + ... + xm = x_{m+1} + ... + x_n = 0. 1. Write the model in vector form as Y = Xβ + ε, describing the entries of the matrix X. 2. Determine the least squares estimator β̂ of β.
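A numerical sketch of parts 1–2 (all data and parameter values below are hypothetical, chosen only so each half of the x's sums to zero as the exercise requires): build the design matrix X with a column of ones and a column of the x_i, then solve the normal equations X'X β = X'Y.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 5
n = 2 * m
# x's chosen so that x_1 + ... + x_m = x_{m+1} + ... + x_n = 0,
# matching the exercise's constraint (the values themselves are arbitrary)
x = np.concatenate([np.arange(m) - (m - 1) / 2, np.arange(m) - (m - 1) / 2])
beta_true = np.array([1.0, 2.0])   # hypothetical (beta0, beta1)
y = beta_true[0] + beta_true[1] * x + rng.normal(0, 0.5, n)

# design matrix: row i is (1, x_i)
X = np.column_stack([np.ones(n), x])
# least squares estimator from the normal equations: (X'X)^{-1} X'Y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
```

Because the full vector x sums to zero here, X'X is diagonal, so β̂0 = Ȳ and β̂1 = Σ x_i Y_i / Σ x_i², which is the simplification the constraint is pointing at.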
1. Consider the simple linear regression model: Y_i = β0 + β1 x_i + ε_i, for i = 1, 2, ..., n, where ε1, ..., εn are i.i.d. N(0, σ²). Let b1 = S_xy/S_xx and b0 = Ȳ − b1 x̄ be the least squares estimators of β1 and β0, respectively. We showed in class that Ȳ ~ N(β0 + β1 x̄, σ²/n) and b1 ~ N(β1, σ²/S_xx). We also showed in class that b1 and Ȳ are uncorrelated, i.e. σ{Ȳ, b1} = 0. (a) Show that b0 is...
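The facts quoted from class can be checked by simulation. A Monte Carlo sketch (all x's and parameter values hypothetical): over repeated samples, the mean of Ȳ should be near β0 + β1 x̄, the mean of b1 near β1, and the sample covariance of Ȳ and b1 near zero.

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.array([1.0, 2.0, 4.0, 5.0, 8.0])   # hypothetical fixed x's
n, xbar = x.size, x.mean()
Sxx = ((x - xbar) ** 2).sum()
beta0, beta1, sigma = 1.0, 0.5, 1.0        # hypothetical true values

reps = 20000
ybars = np.empty(reps)
b1s = np.empty(reps)
for r in range(reps):
    y = beta0 + beta1 * x + rng.normal(0, sigma, n)
    ybars[r] = y.mean()
    b1s[r] = ((x - xbar) * y).sum() / Sxx   # b1 = S_xy / S_xx

# empirical means match the stated distributions; the sample covariance
# of Ybar and b1 sits near zero, as claimed
print(ybars.mean(), b1s.mean(), np.cov(ybars, b1s)[0, 1])
```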
A simple linear regression model is given as follows: Y_i = β0 + β1 x_i + ε_i, for i = 1, ..., n, where the ε_i are i.i.d. following a N(0, σ²) distribution. It is known that x_i = 1 for i = 1, ..., n1, and x_i = 0 otherwise, with n1 < n2. Denote n2 = n − n1, ȳ1 = (1/n1) Σ_{i=1}^{n1} y_i, and ȳ2 = (1/n2) Σ_{i=n1+1}^{n} y_i. 1. Find the least squares estimators of β0 and β1, in terms of...
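With a 0/1 predictor the least squares fit reduces to the two group means: β̂0 = ȳ2 and β̂1 = ȳ1 − ȳ2. A quick numerical check of this closed form (hypothetical data):

```python
import numpy as np

rng = np.random.default_rng(1)
n1, n2 = 4, 6
n = n1 + n2
x = np.concatenate([np.ones(n1), np.zeros(n2)])  # x_i = 1 for i <= n1, else 0
y = rng.normal(3.0, 1.0, n)                      # arbitrary illustrative responses

# least squares fit of y on (1, x)
X = np.column_stack([np.ones(n), x])
beta0_hat, beta1_hat = np.linalg.lstsq(X, y, rcond=None)[0]

ybar1 = y[:n1].mean()   # mean of the x = 1 group
ybar2 = y[n1:].mean()   # mean of the x = 0 group
assert np.isclose(beta0_hat, ybar2)            # beta0_hat = ybar2
assert np.isclose(beta1_hat, ybar1 - ybar2)    # beta1_hat = ybar1 - ybar2
```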
2. Consider the simple linear regression model: Y_i = β0 + β1 x_i + ε_i, for i = 1, 2, ..., n, where ε1, ..., εn are i.i.d. N(0, σ²). Suppose that we would like to estimate the mean response at x = x*, that is, we want to estimate μ_{Y|x=x*} = β0 + β1 x*. The least squares estimator for μ_{Y|x=x*} is μ̂ = b0 + b1 x*, where b0, b1 are the least squares estimators for β0, β1. (a) Show that the least squares estimator for...
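The textbook variance of this estimator is Var(μ̂) = σ² (1/n + (x* − x̄)²/S_xx). A Monte Carlo sketch (hypothetical x's and parameter values) comparing the empirical variance of μ̂ = b0 + b1 x* with that formula:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 9.0, 10)   # hypothetical fixed design points
n = x.size
xbar = x.mean()
Sxx = ((x - xbar) ** 2).sum()
sigma = 1.0
beta0, beta1 = 2.0, 0.5          # hypothetical true parameters
x_star = 7.0

reps = 20000
mu_hats = np.empty(reps)
for r in range(reps):
    y = beta0 + beta1 * x + rng.normal(0, sigma, n)
    b1 = ((x - xbar) * (y - y.mean())).sum() / Sxx
    b0 = y.mean() - b1 * xbar
    mu_hats[r] = b0 + b1 * x_star      # mu_hat = b0 + b1 * x*

var_formula = sigma**2 * (1 / n + (x_star - xbar) ** 2 / Sxx)
print(mu_hats.var(), var_formula)      # the two agree up to Monte Carlo error
```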
7.22. In the regression model Y_i = β0 + β1 X_i + β2(3X_i² − 2) + ε_i, i = 1, 2, 3, with X1 = −1, X2 = 0, and X3 = 1, what happens to the least squares estimates of β0 and β1 when β2 = 0? Why?
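The key observation is that the column 3X² − 2 evaluated at X = −1, 0, 1 equals (1, −2, 1), which is orthogonal to both the intercept column and the X column. A small numerical check with arbitrary illustrative responses: fitting with and without the quadratic term leaves β̂0 and β̂1 unchanged.

```python
import numpy as np

X = np.array([-1.0, 0.0, 1.0])
y = np.array([0.3, 1.1, 2.2])    # arbitrary illustrative data

# full model: columns (1, X, 3X^2 - 2); reduced model drops the last column
A_full = np.column_stack([np.ones(3), X, 3 * X**2 - 2])
A_red = np.column_stack([np.ones(3), X])

b_full = np.linalg.lstsq(A_full, y, rcond=None)[0]
b_red = np.linalg.lstsq(A_red, y, rcond=None)[0]

# because the third column is orthogonal to the first two, the estimates
# of beta0 and beta1 coincide in both fits
assert np.allclose(b_full[:2], b_red)
```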
1. Consider the simple linear regression model where β0 is known. (a) Find the least squares estimator b1 of β1. (b) Is this estimator unbiased? Prove your result. (c) Find an expression for Var(b1 | x1, ..., xn) in terms of x1, ..., xn and σ².
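For context, when β0 is known, minimizing Σ (Y_i − β0 − b1 x_i)² gives b1 = Σ x_i (Y_i − β0) / Σ x_i², with conditional variance σ²/Σ x_i². A simulation sketch (hypothetical x's and parameter values) checking both properties:

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.array([1.0, 2.0, 3.0, 4.0])   # hypothetical fixed x's
beta0, beta1, sigma = 2.0, 1.5, 1.0  # beta0 treated as known

reps = 20000
b1s = np.empty(reps)
for r in range(reps):
    y = beta0 + beta1 * x + rng.normal(0, sigma, x.size)
    # b1 = sum x_i (Y_i - beta0) / sum x_i^2
    b1s[r] = (x * (y - beta0)).sum() / (x**2).sum()

print(b1s.mean())                            # near beta1: unbiased
print(b1s.var(), sigma**2 / (x**2).sum())    # near sigma^2 / sum x_i^2
```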
1. Consider a regression model Y_i = x_i'β + ε_i, i = 1, ..., n. You estimate this model using the OLS estimator. (a) Present and discuss the assumptions for OLS estimation.
Consider the simple linear regression model: Y_i = β0 + β1 x_i + ε_i, i = 1, ..., n, with the least squares estimates β̂' = (β̂0, β̂1). We observe a new value of the predictor: x0' = (1, x0). Show that the expression for the 100(1 − α)% prediction interval reduces to the following:
β̂0 + β̂1 x0 ± t_{α/2, n−2} · s · √(1 + 1/n + (x0 − x̄)² / Σ_{i=1}^{n} (x_i − x̄)²).
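A numerical sketch of evaluating this interval (all data illustrative): fit the line, estimate s from the residual mean square with n − 2 degrees of freedom, and assemble the interval at a hypothetical new point x0.

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 5.0, 6.0, 9.0])
y = np.array([2.1, 2.9, 4.2, 5.8, 7.1, 9.9])   # illustrative responses
n, xbar = x.size, x.mean()
Sxx = ((x - xbar) ** 2).sum()

# least squares fit
b1 = ((x - xbar) * (y - y.mean())).sum() / Sxx
b0 = y.mean() - b1 * xbar
resid = y - (b0 + b1 * x)
s = np.sqrt((resid ** 2).sum() / (n - 2))      # sqrt(MSE), n - 2 df

x0, alpha = 4.0, 0.05                          # hypothetical new point
t = stats.t.ppf(1 - alpha / 2, n - 2)          # t_{alpha/2, n-2}
half = t * s * np.sqrt(1 + 1 / n + (x0 - xbar) ** 2 / Sxx)
lower, upper = b0 + b1 * x0 - half, b0 + b1 * x0 + half
print(lower, upper)
```

The extra "1 +" under the square root (compared with the confidence interval for the mean response) accounts for the variance of the single new observation itself.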
4. (24 marks) Suppose that the random variables Y1, ..., Yn satisfy Y_i = β0 + β1 X_i + ε_i, i = 1, ..., n, where β0 and β1 are parameters, X1, ..., Xn are constants, and ε1, ..., εn are independent and identically distributed random variables with ε_i ~ N(0, σ²), where σ² is a third unknown parameter. This is the familiar form of a simple linear regression model, where the parameters β0, β1, and σ² explain the relationship between a dependent (or response) variable Y...
Consider the following slope estimator: b = Σ_{i=1}^{n} Y_i / Σ_{i=1}^{n} x_i. Suppose the true model is Y_i = β0 + β1 x_i + ε_i and the model satisfies the Gauss–Markov conditions. Answer the following questions: (a) What assumption, in addition to the Gauss–Markov assumptions, is required to estimate the model? (b) Show that, in general, b is a biased estimator of β1. (c) Outline the special condition(s) under which b is an unbiased estimator of β1.
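Assuming the estimator is b = (Σ Y_i)/(Σ x_i), a common form of this exercise (the denominator is an assumption here), a simulation illustrates part (b): E[b] = β1 + n β0 / Σ x_i, so b is biased whenever β0 ≠ 0, which also suggests the special condition in part (c).

```python
import numpy as np

rng = np.random.default_rng(11)
x = np.array([1.0, 2.0, 3.0, 4.0])   # hypothetical x's with sum(x) != 0
beta0, beta1, sigma = 2.0, 1.0, 0.5  # hypothetical true values, beta0 != 0

reps = 20000
bs = np.empty(reps)
for r in range(reps):
    y = beta0 + beta1 * x + rng.normal(0, sigma, x.size)
    bs[r] = y.sum() / x.sum()        # b = sum(Y) / sum(x)

# E[b] = beta1 + n*beta0/sum(x); with these values that is 1 + 8/10 = 1.8,
# not the true slope beta1 = 1, so b is biased
expected = beta1 + x.size * beta0 / x.sum()
print(bs.mean(), expected)
```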