#2 Can you please go through the steps of how you got this? Consider the simple linear regression model $y_i = \beta_1 x_i + \beta_0 + \varepsilon_i$, where the errors $\varepsilon_1, \dots, \varepsilon_n$ are i.i.d. random variables with $E[\varepsilon_i] = 0$ and $\operatorname{var}(\varepsilon_i) = \sigma^2$, $i = 1, \dots, n$. Solve either one of the questions below. 1. Let $\hat\beta_1$ be the least squares estimator for $\beta_1$. Show that $\hat\beta_1$ is the best linear unbiased estimator for $\beta_1$. (Note: you can read the proof on Wikipedia, but you cannot...
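A minimal sketch of the standard Gauss–Markov argument for question 1, assuming the notation $S_{xx} = \sum_i (x_i - \bar x)^2$ (not defined in the question itself):
\[
\hat\beta_1 = \sum_{i=1}^n k_i y_i, \qquad k_i = \frac{x_i - \bar x}{S_{xx}}, \qquad \sum_i k_i = 0, \quad \sum_i k_i x_i = 1 .
\]
Any other linear estimator $\tilde\beta_1 = \sum_i c_i y_i$ is unbiased only if $\sum_i c_i = 0$ and $\sum_i c_i x_i = 1$. Writing $c_i = k_i + d_i$, those conditions force $\sum_i d_i = 0$ and $\sum_i d_i x_i = 0$, hence $\sum_i k_i d_i = 0$ and
\[
\operatorname{Var}(\tilde\beta_1) = \sigma^2 \sum_i c_i^2 = \sigma^2\Big(\sum_i k_i^2 + \sum_i d_i^2\Big) \ge \sigma^2 \sum_i k_i^2 = \frac{\sigma^2}{S_{xx}} = \operatorname{Var}(\hat\beta_1).
\]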
1. Consider the simple linear regression model where $\beta_0$ is known. (a) Find the least squares estimator $b_1$ of $\beta_1$. (b) Is this estimator unbiased? Prove your result. (c) Find an expression for $\operatorname{Var}(b_1 \mid x_1, \dots, x_n)$ in terms of $x_1, \dots, x_n$ and $\sigma^2$.
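For orientation, a short sketch of how (a)–(c) typically work out when $\beta_0$ is known (treating the model as $y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$, which the question leaves implicit):
\[
b_1 = \arg\min_{\beta_1} \sum_{i=1}^n \bigl(y_i - \beta_0 - \beta_1 x_i\bigr)^2
    = \frac{\sum_i x_i (y_i - \beta_0)}{\sum_i x_i^2},
\]
\[
E[b_1] = \frac{\sum_i x_i(\beta_1 x_i)}{\sum_i x_i^2} = \beta_1,
\qquad
\operatorname{Var}(b_1 \mid x_1, \dots, x_n) = \frac{\sigma^2 \sum_i x_i^2}{\bigl(\sum_i x_i^2\bigr)^2} = \frac{\sigma^2}{\sum_i x_i^2}.
\]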
1. Consider the simple linear regression model $Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$, where $\varepsilon_1, \dots, \varepsilon_n$ are i.i.d. $N(0, \sigma^2)$, for $i = 1, 2, \dots, n$. Let $b_1 = s_{xy}/s_{xx}$ and $b_0 = \bar Y - b_1 \bar x$ be the least squares estimators of $\beta_1$ and $\beta_0$, respectively. We showed in class that $\bar Y \sim N(\beta_0 + \beta_1 \bar x,\ \sigma^2/n)$ and $b_1 \sim N(\beta_1,\ \sigma^2/s_{xx})$. We also showed in class that $b_1$ and $\bar Y$ are uncorrelated, i.e. $\sigma\{\bar Y, b_1\} = 0$. (a) Show that $b_0$ is...
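A sketch of the usual argument for (a), assuming $s_{xx} = \sum_i (x_i - \bar x)^2$ as notation and using the two facts quoted from class:
\[
E[b_0] = E[\bar Y] - \bar x\, E[b_1] = (\beta_0 + \beta_1 \bar x) - \bar x \beta_1 = \beta_0,
\]
\[
\operatorname{Var}(b_0) = \operatorname{Var}(\bar Y) + \bar x^2 \operatorname{Var}(b_1)
 = \sigma^2\Bigl(\frac{1}{n} + \frac{\bar x^2}{s_{xx}}\Bigr),
\]
where the cross term vanishes because $\bar Y$ and $b_1$ are uncorrelated; and since $b_0$ is a linear combination of the jointly normal $\bar Y$ and $b_1$, we get $b_0 \sim N\bigl(\beta_0,\ \sigma^2(1/n + \bar x^2/s_{xx})\bigr)$.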
2. Consider the simple linear regression model $Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$, where $\varepsilon_1, \dots, \varepsilon_n$ are i.i.d. $N(0, \sigma^2)$, for $i = 1, 2, \dots, n$. Suppose that we would like to estimate the mean response at $x = x^*$, that is, we want to estimate $\mu_{Y|x=x^*} = \beta_0 + \beta_1 x^*$. The least squares estimator for $\mu_{Y|x=x^*}$ is $\hat\mu_{Y|x=x^*} = b_0 + b_1 x^*$, where $b_0, b_1$ are the least squares estimators for $\beta_0, \beta_1$. (a) Show that the least squares estimator for...
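A sketch of the standard calculation, with the estimator rewritten as $\hat\mu_{Y|x=x^*} = \bar Y + b_1(x^* - \bar x)$ and $s_{xx} = \sum_i (x_i - \bar x)^2$ assumed as notation:
\[
E[\hat\mu_{Y|x=x^*}] = E[\bar Y] + (x^* - \bar x)E[b_1] = \beta_0 + \beta_1 \bar x + \beta_1(x^* - \bar x) = \beta_0 + \beta_1 x^*,
\]
\[
\operatorname{Var}(\hat\mu_{Y|x=x^*}) = \operatorname{Var}(\bar Y) + (x^* - \bar x)^2\operatorname{Var}(b_1)
 = \sigma^2\Bigl(\frac{1}{n} + \frac{(x^* - \bar x)^2}{s_{xx}}\Bigr),
\]
again using that $\bar Y$ and $b_1$ are uncorrelated.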
Exercise 2b please! Exercise 1: Consider the regression model through the origin $y_i = \beta_1 x_i + \varepsilon_i$, where $\varepsilon_i \sim N(0, \sigma^2)$; it is assumed that the regression line passes through the origin $(0, 0)$. a. Show that for this model $s_e^2 = \frac{1}{n-1}\sum_{i=1}^n (y_i - \hat\beta_1 x_i)^2$ is an unbiased estimator of $\sigma^2$. d. Show that $(n-1)s_e^2/\sigma^2 \sim \chi^2_{n-1}$, where $s_e^2$ is the unbiased estimator of $\sigma^2$ from question (a). Exercise 2: Refer to Exercise 1. a. Show that $\hat\beta_1$ is BLUE (best linear unbiased estimator). b. Show that $\hat\beta_1$ has...
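For reference, a sketch of the through-the-origin estimator and the unbiasedness claim in Exercise 1(a) (the exact form of $s_e^2$ above is reconstructed from part (d), so treat it as an assumption):
\[
\hat\beta_1 = \frac{\sum_i x_i y_i}{\sum_i x_i^2}, \qquad
E[\hat\beta_1] = \beta_1, \qquad
\operatorname{Var}(\hat\beta_1) = \frac{\sigma^2}{\sum_i x_i^2},
\]
\[
\sum_i (y_i - \hat\beta_1 x_i)^2 = \sum_i y_i^2 - \hat\beta_1^2 \sum_i x_i^2,
\qquad
E\Bigl[\sum_i (y_i - \hat\beta_1 x_i)^2\Bigr]
 = \bigl(n\sigma^2 + \beta_1^2\textstyle\sum_i x_i^2\bigr) - \bigl(\sigma^2 + \beta_1^2\textstyle\sum_i x_i^2\bigr)
 = (n-1)\sigma^2,
\]
so dividing the residual sum of squares by $n-1$ gives an unbiased estimator of $\sigma^2$.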
Q. 1 Consider the multiple linear regression model $Y = X\beta + \varepsilon$, where $\varepsilon \sim MVN(0, \sigma^2 V)$ and $V \neq I_n$ is a diagonal matrix. a) Derive the weighted least squares estimator for $\beta$, i.e., $\hat\beta_{WLS}$. b) Show $\hat\beta_{WLS}$ is an unbiased estimator for $\beta$. c) Derive the variances of $\hat\beta_{WLS}$ and the OLS estimator of $\beta$. Is the OLS estimator of $\beta$ still the BLUE? In one sentence, explain why or why not.
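A compact sketch of the usual answers, under the stated assumption $\operatorname{var}(\varepsilon) = \sigma^2 V$ with $V$ known and diagonal:
\[
\hat\beta_{WLS} = \arg\min_\beta\, (Y - X\beta)'V^{-1}(Y - X\beta) = (X'V^{-1}X)^{-1}X'V^{-1}Y,
\qquad
E[\hat\beta_{WLS}] = (X'V^{-1}X)^{-1}X'V^{-1}X\beta = \beta,
\]
\[
\operatorname{var}(\hat\beta_{WLS}) = \sigma^2 (X'V^{-1}X)^{-1},
\qquad
\operatorname{var}(\hat\beta_{OLS}) = \sigma^2 (X'X)^{-1}X'VX(X'X)^{-1}.
\]
OLS is no longer BLUE here, because the Gauss–Markov conditions require $\operatorname{var}(\varepsilon) = \sigma^2 I_n$; under $\sigma^2 V$ the best linear unbiased estimator is $\hat\beta_{WLS}$ (generalized least squares with diagonal $V$).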
Consider the simple linear regression model where $\beta_0$ is known. (a) Find the least squares estimator $b_1$ of $\beta_1$. (b) Is this estimator unbiased? Prove your result.
1.(c) 2.(a),(b) 5. Let $X_1, \dots, X_n$ be iid $N(\theta, 1)$. (a) Show that $\bar X$ is a complete sufficient statistic. (b) Show that the UMVUE of $\theta^2$ is $\bar X^2 - 1/n$. 6. Let $X_1, \dots, X_n$ be i.i.d. gamma$(\alpha, \theta)$ where $\alpha > 1$ is known ($f(x) = \frac{1}{\Gamma(\alpha)\theta^\alpha}\, x^{\alpha-1} e^{-x/\theta}$, $x > 0$, $\theta > 0$). (a) Show that $\sum_i X_i$ is complete and sufficient for $\theta$. (b) Find $E[1/X]$. (c) Find the UMVUE of $1/\theta$. ...
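A few facts the standard solutions to 5 and 6 rely on (a sketch, not a full write-up). For 5, the $N(\theta, 1)$ family is a one-parameter exponential family with natural statistic $\sum_i X_i$, so $\bar X$ is complete and sufficient; since $E[\bar X^2] = \theta^2 + 1/n$, the statistic $\bar X^2 - 1/n$ is unbiased for $\theta^2$ and a function of $\bar X$, hence the UMVUE. For 6, the gamma$(\alpha, \theta)$ family with $\alpha$ known is again a one-parameter exponential family with complete sufficient statistic $\sum_i X_i \sim \text{gamma}(n\alpha, \theta)$; a direct integral gives
\[
E[1/X] = \frac{1}{(\alpha - 1)\theta} \quad (\text{this needs } \alpha > 1),
\qquad
E\Bigl[1/\textstyle\sum_i X_i\Bigr] = \frac{1}{(n\alpha - 1)\theta},
\]
so $(n\alpha - 1)/\sum_i X_i$ is unbiased for $1/\theta$ and, being a function of the complete sufficient statistic, is the UMVUE.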
Consider the multiple regression model $y = X\beta + \varepsilon$, with $E(\varepsilon) = 0$ and $\operatorname{var}(\varepsilon) = \sigma^2 I$. Problem 1: Gauss–Markov theorem (revisited). We already know that $E(\hat\beta) = \beta$ and $\operatorname{var}(\hat\beta) = \sigma^2 (X'X)^{-1}$. Consider now another unbiased estimator of $\beta$, say $b = AY$. Since we are assuming that $b$ is unbiased, we reach the conclusion that $AX = I$ (why?). The Gauss–Markov theorem claims that $\operatorname{var}(b) - \operatorname{var}(\hat\beta)$ is positive semi-definite, which asks that we investigate $q'\bigl[\operatorname{var}(b) - \operatorname{var}(\hat\beta)\bigr]q$. Show...
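A sketch of the standard argument, writing $\hat\beta = (X'X)^{-1}X'Y$ and introducing $D = A - (X'X)^{-1}X'$ (the matrix $D$ is notation added here, not part of the question). Unbiasedness of $b = AY$ means $E[b] = AX\beta = \beta$ for every $\beta$, which forces $AX = I$. Then $DX = AX - (X'X)^{-1}X'X = 0$, and
\[
\operatorname{var}(b) = \sigma^2 AA' = \sigma^2\bigl[(X'X)^{-1} + DD'\bigr],
\]
because the cross term $(X'X)^{-1}X'D' = \bigl(DX(X'X)^{-1}\bigr)' = 0$. Hence
\[
q'\bigl[\operatorname{var}(b) - \operatorname{var}(\hat\beta)\bigr]q = \sigma^2\, q'DD'q = \sigma^2\lVert D'q\rVert^2 \ge 0,
\]
so the difference of the covariance matrices is positive semi-definite.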
5.13. Suppose $X_1, X_2, \dots, X_n$ are iid $N(\mu, \sigma^2)$, where $-\infty < \mu < \infty$ and $\sigma^2 > 0$. (a) Consider the statistic $cS^2$, where $c$ is a constant and $S^2$ is the usual sample variance (denominator $n-1$). Find the value of $c$ that minimizes the mean squared error $E[(cS^2 - \sigma^2)^2]$. (b) Consider the normal subfamily where $\sigma^2 = \mu^2$, with $\mu > 0$. Let $S$ denote the sample standard deviation. Find a linear combination $c_1 \bar X + c_2 S$ whose expectation is equal to $\mu$. Find the...
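A sketch of part (a), using the standard fact $(n-1)S^2/\sigma^2 \sim \chi^2_{n-1}$ (the MSE criterion is one reading of the garbled objective above, so treat it as an assumption):
\[
E[S^2] = \sigma^2, \qquad \operatorname{Var}(S^2) = \frac{2\sigma^4}{n-1},
\]
\[
\operatorname{MSE}(cS^2) = c^2\,\frac{2\sigma^4}{n-1} + (c-1)^2\sigma^4,
\qquad
\frac{d}{dc}\operatorname{MSE}(cS^2) = 0 \;\Longrightarrow\; c = \frac{n-1}{n+1},
\]
which is smaller than both $c = 1$ (the unbiased choice) and $c = (n-1)/n$ (the divisor-$n$ estimator).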