3. Consider a forecasting model with a trend, where t is the time index, for t...
2. Consider the simple linear regression model Y_i = β0 + β1 x_i + ε_i, where ε_1, ..., ε_n are i.i.d. N(0, σ²), for i = 1, 2, ..., n. Suppose that we would like to estimate the mean response at x = x*, that is, we want to estimate μ_{Y|x=x*} = β0 + β1 x*. The least squares estimator for μ_{Y|x=x*} is μ̂ = b0 + b1 x*, where b0, b1 are the least squares estimators for β0, β1. (a) Show that the least squares estimator for...
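A minimal numerical sketch of the mean-response estimator in this problem (the data values below are made up for illustration): fit b0, b1 by least squares, then evaluate μ̂ = b0 + b1 x* at a chosen x*.

```python
import numpy as np

# Illustrative sketch (made-up data): the least squares estimate of the
# mean response at x = x* is mu_hat = b0 + b1 * x_star.
rng = np.random.default_rng(0)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 + 0.5 * x + rng.normal(0, 0.1, size=x.size)

x_bar, y_bar = x.mean(), y.mean()
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
b0 = y_bar - b1 * x_bar

x_star = 2.5
mu_hat = b0 + b1 * x_star        # estimated mean response at x = x*
print(round(mu_hat, 3))
```

The closed-form (b1, b0) here agree with `np.polyfit(x, y, 1)`, so either route gives the same μ̂.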
1. Consider the simple linear regression model Y_i = β0 + β1 x_i + ε_i, where β0 is known. (a) Find the least squares estimator b1 of β1. (b) Is this estimator unbiased? Prove your result. (c) Find an expression for Var(b1 | x1, ..., xn) in terms of x1, ..., xn and σ².
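A small Monte Carlo sketch of parts (a) and (b) (my own illustration, with made-up parameter values): when β0 is known, minimizing Σ (y_i − β0 − b1 x_i)² gives b1 = Σ x_i (y_i − β0) / Σ x_i², and repeated simulation suggests this estimator is unbiased.

```python
import numpy as np

# Sketch (illustrative values): with beta0 known, the least squares
# estimator is  b1 = sum x_i (y_i - beta0) / sum x_i^2.
# Monte Carlo check that E(b1) is close to the true beta1.
rng = np.random.default_rng(1)
beta0, beta1, sigma = 1.0, 2.0, 0.5
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5])

estimates = []
for _ in range(20000):
    y = beta0 + beta1 * x + rng.normal(0, sigma, size=x.size)
    b1 = np.sum(x * (y - beta0)) / np.sum(x ** 2)
    estimates.append(b1)

print(round(float(np.mean(estimates)), 2))   # close to beta1 = 2.0
```

Note that the sums here are not mean-centered: because β0 is known, the estimator uses raw x_i, not deviations from x̄.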
1. A series of observations is assumed to follow the trend model y_t = c + b t + e_t, where t̄ = (n + 1)/2 = 5 and e_t is white noise. Write down the least squares estimates ĉ, b̂ of c, b in terms of y1, ..., y9. The next term in the series, y10, is predicted as ŷ10 = ĉ + 10 b̂. Express this in the form w1 y1 + w2 y2 + ... + w9 y9 and obtain the numerical values of w1, ..., w9. [10 marks]
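A numerical sketch of the prediction weights (assuming n = 9, as t̄ = (n+1)/2 = 5 suggests): the least squares prediction at t = 10 is a linear combination wᵀy of the observations, where wᵀ = x*ᵀ(XᵀX)⁻¹Xᵀ is a row of the hat-matrix construction.

```python
import numpy as np

# Sketch (assuming n = 9): y_hat_10 = c_hat + 10*b_hat can be written
# as w_1 y_1 + ... + w_9 y_9 with  w^T = x_star^T (X^T X)^{-1} X^T.
n = 9
t = np.arange(1, n + 1)
X = np.column_stack([np.ones(n), t])   # columns: intercept, trend
x_star = np.array([1.0, n + 1.0])      # predict at t = 10
w = x_star @ np.linalg.inv(X.T @ X) @ X.T

print(np.round(w, 4))                  # weights w_t = (3t - 11)/36
print(round(float(w.sum()), 6))        # weights sum to 1.0
```

Working the algebra by hand gives w_t = (3t − 11)/36, so early observations get negative weight and w9 = 16/36 is the largest; the weights sum to 1 because the design includes an intercept.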
4. (20 pts) Consider the following regression model: Y_i = β0 + β1 x_i + ε_i, i = 1, 2, ..., n, where ε_i, i = 1, 2, ..., n, are independent, but ε_i ~ N(0, σ²/i). (a) Do you think it is suitable to apply the (ordinary) least squares regression technique to the data (x_i, Y_i)? Give a brief reasoning. (b) Construct a transformed model so that you can use the ordinary least squares method. (c) Find the parameter estimates for the transformed model in (b). (d) Find the weighted least...
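A sketch of the transformation in part (b), on made-up data: multiplying the model through by √i gives √i·Y_i = β0·√i + β1·√i·x_i + √i·ε_i, whose errors have constant variance σ², so ordinary least squares applies to the transformed data (equivalently, weighted least squares with w_i = i).

```python
import numpy as np

# Sketch (illustrative data): Var(eps_i) = sigma^2 / i, so scaling row i
# by sqrt(i) makes the errors homoscedastic; OLS on the scaled data is
# WLS with weights w_i = i.
rng = np.random.default_rng(2)
n = 200
i = np.arange(1, n + 1)
x = rng.uniform(0, 1, size=n)
y = 1.0 + 3.0 * x + rng.normal(0, 1.0 / np.sqrt(i))   # true (beta0, beta1) = (1, 3)

s = np.sqrt(i)
Z = np.column_stack([s, s * x])        # transformed design: (sqrt(i), sqrt(i)*x_i)
b, *_ = np.linalg.lstsq(Z, s * y, rcond=None)
print(np.round(b, 2))                  # estimates of (beta0, beta1)
```

Note the transformed model has no ordinary intercept column: the column √i plays the role of the intercept, which is part (c)'s point.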
Problem 3. Consider the non-constant variance linear model Y_i = β0 + β1 x_{i,1} + β2 x_{i,2} + ... + β_{p−1} x_{i,p−1} + ε_i, (1) with ε_i ~ N(0, σ_i²), i = 1, ..., n. Define the reciprocal of the variance of ε_i as the weight w_i = 1/σ_i², and let W = diag(w_1, w_2, ..., w_n). We can estimate the non-constant variance model by minimizing the objective function Σ_i w_i (Y_i − β0 − β1 x_{i,1} − ... − β_{p−1} x_{i,p−1})². Task: Derive the weighted least squares equation (3) β̂_W = (XᵀWX)⁻¹ XᵀWY.
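A numerical sanity check of equation (3) (my own sketch, with random illustrative data): the closed form (XᵀWX)⁻¹XᵀWY agrees with ordinary least squares applied to √w-scaled data, which minimizes the same weighted objective.

```python
import numpy as np

# Sketch: verify  beta_W = (X^T W X)^{-1} X^T W Y  numerically by
# comparing it with OLS on sqrt(w)-rescaled rows.
rng = np.random.default_rng(3)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
Y = rng.normal(size=n)
w = rng.uniform(0.5, 2.0, size=n)     # weights w_i = 1 / sigma_i^2
W = np.diag(w)

beta_w = np.linalg.solve(X.T @ W @ X, X.T @ W @ Y)

s = np.sqrt(w)
beta_ols, *_ = np.linalg.lstsq(X * s[:, None], Y * s, rcond=None)
print(np.allclose(beta_w, beta_ols))   # True
```

Using `np.linalg.solve` on the normal equations avoids forming the explicit inverse, but either route reproduces the formula in (3).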
For observations {Y_i, X_i}_{i=1}^n, recall that for the model Y_i = α0 + β0 X_i + e_i (1), the OLS estimator for {α0, β0}, the minimizer of Σ_i (Y_i − a − b X_i)², is β̂ = Σ_i (X_i − X̄)(Y_i − Ȳ) / Σ_i (X_i − X̄)² and α̂ = Ȳ − β̂ X̄. When equation (1) is the true data generating process, {X_i} are non-stochastic, and {e_i} are random variables with E(e_i) = 0, E(e_i²) = σ², and E(e_i e_j) = 0 for any i, j = 1, 2, ..., n with i ≠ j, we...
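The closed-form OLS estimators quoted above can be checked directly against a library fit (a sketch on made-up data):

```python
import numpy as np

# Sketch: the closed-form OLS estimates
#   beta_hat  = sum (X_i - X_bar)(Y_i - Y_bar) / sum (X_i - X_bar)^2,
#   alpha_hat = Y_bar - beta_hat * X_bar,
# compared against numpy's polyfit on illustrative data.
x = np.array([1.0, 2.0, 4.0, 5.0, 7.0])
y = np.array([2.1, 3.9, 8.2, 9.8, 14.1])

beta_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
alpha_hat = y.mean() - beta_hat * x.mean()

slope, intercept = np.polyfit(x, y, 1)
print(np.allclose([beta_hat, alpha_hat], [slope, intercept]))   # True
```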
1. Consider the simple linear regression model Y_i = β0 + β1 x_i + ε_i, where ε_1, ..., ε_n are i.i.d. N(0, σ²), for i = 1, 2, ..., n. Let b1 = s_xy/s_xx and b0 = Ȳ − b1 x̄ be the least squares estimators of β1 and β0, respectively. We showed in class that b1 ~ N(β1, σ²/S_xx) and Ȳ ~ N(β0 + β1 x̄, σ²/n). We also showed in class that b1 and Ȳ are uncorrelated, i.e. σ{Ȳ, b1} = 0. (a) Show that b0 is...
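A Monte Carlo sketch of the facts quoted from class (parameter values are my own illustration): simulating many samples, Var(b1)·S_xx comes out near σ² and the sample covariance of Ȳ and b1 is near zero, which is what part (a) builds on for b0 = Ȳ − b1 x̄.

```python
import numpy as np

# Sketch: check  b1 ~ N(beta1, sigma^2/S_xx)  and  cov(Ybar, b1) = 0
# by simulation, with illustrative parameters.
rng = np.random.default_rng(4)
beta0, beta1, sigma = 1.0, 2.0, 1.0
x = np.linspace(0, 4, 9)
S_xx = np.sum((x - x.mean()) ** 2)

b1s, ybars = [], []
for _ in range(20000):
    y = beta0 + beta1 * x + rng.normal(0, sigma, size=x.size)
    b1s.append(np.sum((x - x.mean()) * (y - y.mean())) / S_xx)
    ybars.append(y.mean())
b1s, ybars = np.array(b1s), np.array(ybars)

print(round(float(np.var(b1s) * S_xx), 1))   # approx sigma^2 = 1.0
print(abs(np.cov(ybars, b1s)[0, 1]) < 0.01)  # covariance approx 0
```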
2. (a) Let us consider a full model of a balanced CRD design (all t treatments have an equal number of observations r) with t treatments and r replications of each treatment, hence having n = rt observations. Minimizing the sum of squared errors Q(μ, τ1, ..., τt) = Σ_{i=1}^{t} Σ_{j=1}^{r} (Y_ij − μ − τ_i)² with respect to μ and τ_i, find the least squares estimators of μ and τ_i as μ̂ and τ̂_i. Hint: take the derivative of the objective function with respect to μ and τ_i and equate them...
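A numerical sketch of the result the hint leads to (assuming the usual side condition Σ_i τ_i = 0, which the truncated problem does not state): setting the derivatives of Q to zero gives μ̂ = Ȳ.. and τ̂_i = Ȳ_i. − Ȳ.. for the balanced design.

```python
import numpy as np

# Sketch (assumes the side condition sum_i tau_i = 0): for a balanced
# CRD, mu_hat is the grand mean and tau_hat_i is the treatment-mean
# deviation.  Illustrative random data.
rng = np.random.default_rng(5)
t, r = 4, 5                       # t treatments, r replications, n = t*r
Y = rng.normal(size=(t, r))       # Y[i, j]: j-th observation of treatment i

mu_hat = Y.mean()                 # grand mean  Y_bar..
tau_hat = Y.mean(axis=1) - mu_hat # treatment effects  Y_bar_i. - Y_bar..

def Q(mu, tau):
    """Sum of squared errors for candidate (mu, tau)."""
    return np.sum((Y - mu - tau[:, None]) ** 2)

# The estimators minimize Q: perturbing either argument cannot lower it.
assert Q(mu_hat, tau_hat) <= Q(mu_hat + 0.1, tau_hat)
assert Q(mu_hat, tau_hat) <= Q(mu_hat, tau_hat + 0.1)
print(round(abs(float(tau_hat.sum())), 10))   # 0.0: side condition holds
```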
Exercise 4.11. Consider the regression model Y_i = β0 + β1 X_i + u_i. Suppose that you know β0. Derive the formula for the least squares estimator of β1. The least squares objective function is: A. Σ_{i=1}^{n} (Y_i − b0 − b1 X_i)²  B. Σ_{i=1}^{n} (Y_i − β0 − b1 X_i)²