4. (24 marks) Suppose that the random variables Y_1, ..., Y_n satisfy Y_i = β_0 + β_1 X_i + ε_i, i = 1, ..., n, where β_0 and β_1 are parameters, X_1, ..., X_n are constants, and ε_1, ..., ε_n are independent and identically distributed random variables with ε_i ~ N(0, σ²), where σ² is a third unknown parameter. This is the familiar form of the simple linear regression model, in which the parameters β_0, β_1, and σ² explain the relationship between a dependent (or response) variable Y...
Question 6. Consider the regression model y_i = β_0 + β_1 x_i + ε_i, where the ε_i are i.i.d. N(0, σ²) random variables, for i = 1, 2, ..., n. (a) (4 points) Show that β̂_1 is normally distributed with mean β_1 and variance σ²/S_xx.
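Before attempting the proof, it can help to see the claimed sampling distribution numerically. The sketch below is a Monte Carlo check, not part of the required derivation; the values of β_0, β_1, σ, and the design points x are my illustrative choices, not from the problem.

```python
import random
import statistics

# Simulate y_i = beta0 + beta1*x_i + eps_i many times and compare the
# empirical mean and variance of the least squares slope beta1_hat with
# the claimed values beta1 and sigma^2/S_xx.
random.seed(0)
beta0, beta1, sigma = 1.0, 2.0, 1.0
x = list(range(1, 11))                      # fixed design points (illustrative)
xbar = sum(x) / len(x)
sxx = sum((xi - xbar) ** 2 for xi in x)     # S_xx = 82.5 for these x

estimates = []
for _ in range(2000):
    y = [beta0 + beta1 * xi + random.gauss(0, sigma) for xi in x]
    ybar = sum(y) / len(y)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    estimates.append(sxy / sxx)             # least squares slope b1 = S_xy/S_xx

print(statistics.mean(estimates))           # close to beta1 = 2
print(statistics.variance(estimates))       # close to sigma^2/S_xx ≈ 0.0121
```

The empirical mean and variance should agree with β_1 and σ²/S_xx up to Monte Carlo error, which is what the proof establishes exactly.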
Consider the process Y_t = Y_0 + Σ_{i=1}^{t} e_i, where Y_0 ~ (μ, σ²) and the e_i's are zero-mean, independent, identically distributed random variables with variance 1. Is {Y_t} a stationary process? How about the process ∇Y_t = Y_t − Y_{t−1}? Explain.
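A quick simulation makes the answer easy to guess before writing it up: since Var(Y_t) = σ² + t grows with t, {Y_t} cannot be stationary, while ∇Y_t = Y_t − Y_{t−1} = e_t is i.i.d. and hence stationary. The parameter choices below (μ = 0, σ = 1, normal errors) are illustrative only.

```python
import random
import statistics

# Simulate many independent paths of Y_t = Y_0 + e_1 + ... + e_t and compare
# the variance of Y_t at two time points, plus the variance of the differences.
random.seed(1)
mu, sigma = 0.0, 1.0
T, n_paths = 60, 4000

y5, y50, diffs = [], [], []
for _ in range(n_paths):
    y = random.gauss(mu, sigma)          # Y_0 ~ (mu, sigma^2)
    for t in range(1, T + 1):
        e = random.gauss(0, 1)           # e_t: mean 0, variance 1
        y_prev, y = y, y + e
        if t == 5:
            y5.append(y)
        if t == 50:
            y50.append(y)
    diffs.append(y - y_prev)             # ∇Y_T = e_T

print(statistics.variance(y5))           # ≈ sigma^2 + 5 = 6
print(statistics.variance(y50))          # ≈ sigma^2 + 50 = 51
print(statistics.variance(diffs))        # ≈ 1: differencing removes the trend in variance
```

The time-dependent variance of Y_t violates (weak) stationarity, whereas the differenced process has constant mean and variance.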
1. Consider the simple linear regression model: Y_i = β_0 + β_1 x_i + ε_i, where ε_1, ..., ε_n are i.i.d. N(0, σ²), for i = 1, 2, ..., n. Let b_1 = S_xy/S_xx and b_0 = Ȳ − b_1 x̄ be the least squares estimators of β_1 and β_0, respectively. We showed in class that Ȳ ~ N(β_0 + β_1 x̄, σ²/n) and b_1 ~ N(β_1, σ²/S_xx). We also showed in class that b_1 and Ȳ are uncorrelated, i.e. σ{Ȳ, b_1} = 0. (a) Show that b_0 is...
1. A simple regression model is given by Y_t = β_1 + β_2 X_t + e_t for t = 1, ..., n, (1) where the regression errors e_t, with Var(e_t) = σ², follow an AR(1) model: e_t = ρ e_{t−1} + v_t, t = 1, ..., n, where the v_t's are uncorrelated random variables with constant variance, that is, E(v_t) = 0, Var(v_t) = σ_v², and Cov(v_t, v_s) = 0 for t ≠ s. Now given that Var(e_t) = Var(e_{t−1}) = σ² and Cov(e_{t−1}, v_t) = 0: (a) Show that ... (b) Show that E(e_t e_{t−1}) = ρσ². (c) What problem(s) will...
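The key AR(1) identities behind parts (a) and (b) can be verified numerically before deriving them. This is an illustrative sketch, not the requested proof; the values ρ = 0.6 and σ_v = 1 are my choices. Under stationarity, Var(e_t) = σ_v²/(1 − ρ²) and E(e_t e_{t−1}) = ρ Var(e_t).

```python
import random
import statistics

# Simulate a long AR(1) error series e_t = rho*e_{t-1} + v_t and check its
# stationary variance and lag-1 autocorrelation against the theoretical values.
random.seed(2)
rho, sigma_v = 0.6, 1.0
N, burn_in = 200_000, 1000          # burn-in lets the series reach stationarity

e, es = 0.0, []
for t in range(N + burn_in):
    e = rho * e + random.gauss(0, sigma_v)
    if t >= burn_in:
        es.append(e)

var_e = statistics.variance(es)
lag1 = sum(es[t] * es[t - 1] for t in range(1, len(es))) / (len(es) - 1)

print(var_e)          # ≈ sigma_v^2/(1 - rho^2) = 1/0.64 = 1.5625
print(lag1 / var_e)   # ≈ rho = 0.6
```

The nonzero lag-1 autocorrelation is exactly what makes ordinary least squares inference problematic in part (c): the usual standard errors assume uncorrelated errors.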
1. Let Y_1, Y_2, ..., Y_n be independent and identically distributed N(μ, σ²) random variables. Show that ..., where Φ(·) denotes the cumulative distribution function of the standard normal. [You need to show both of the equalities.]
Consider the simple linear regression model y_i = β_0 + β_1 x_i + ε_i, where the errors ε_1, ..., ε_n are i.i.d. random variables with E(ε_i) = 0 and Var(ε_i) = σ², i = 1, ..., n. Solve either one of the questions below. 1. Let β̂_1 be the least squares estimator for β_1. Show that β̂_1 is the best linear unbiased estimator for β_1. (Note: you can read the proof on Wikipedia, but you cannot use matrix notation in this proof.) 2. Consider a new loss function L_λ(β_0, β_1) = ..., where...
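For question 1, a simulation can build intuition for what "best" means before the proof: compare the least squares slope with another linear unbiased estimator of β_1, here the slope through the two extreme design points, (y_n − y_1)/(x_n − x_1). Both are unbiased, but Gauss–Markov says least squares has the smaller variance. All parameter values below are illustrative choices of mine.

```python
import random
import statistics

# Monte Carlo comparison of two linear unbiased slope estimators.
random.seed(3)
beta0, beta1, sigma = 1.0, 2.0, 1.0
x = list(range(1, 11))
xbar = sum(x) / len(x)
sxx = sum((xi - xbar) ** 2 for xi in x)

ols, endpoints = [], []
for _ in range(2000):
    y = [beta0 + beta1 * xi + random.gauss(0, sigma) for xi in x]
    ybar = sum(y) / len(y)
    # Least squares slope: S_xy / S_xx
    ols.append(sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx)
    # Competing linear unbiased estimator: slope through the endpoints
    endpoints.append((y[-1] - y[0]) / (x[-1] - x[0]))

print(statistics.mean(ols), statistics.mean(endpoints))        # both ≈ beta1 = 2
print(statistics.variance(ols), statistics.variance(endpoints))
```

Both empirical means sit near β_1 (unbiasedness), but the least squares variance (≈ σ²/S_xx) is visibly smaller than the endpoint estimator's (≈ 2σ²/(x_n − x_1)²), which is the pattern the BLUE proof establishes for every linear unbiased competitor.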
e. Consider the multiple regression model y = Xβ + ε, with E(ε) = 0 and Var(ε) = σ²I. Assume that ε ~ N(0, σ²I). When we test the hypothesis H_0: β_j = 0 against H_a: β_j ≠ 0, we use the t statistic with n − k − 1 degrees of freedom. When H_0 is not true, find the expected value and variance of the test statistic.
Exercise 2b please! Exercise 1. Consider the regression model through the origin y_i = β_1 x_i + ε_i, where ε_i ~ N(0, σ²). It is assumed that the regression line passes through the origin (0, 0). a. Show that for this model ... is an unbiased estimator of σ². d. Show that (n − 1)s_e²/σ² ~ χ²_{n−1}, where s_e² is the unbiased estimator of σ² from question (a). Exercise 2. a. Show that β̂_1 is BLUE (best linear unbiased estimator). b. Show that β̂_1 has...
Consider the following simple regression model: y_t = β_1 + β_2 x_t + e_t, where the e_t are independent errors with E(e_t) = 0 and Var(e_t) = σ² x_t². a. In this case, would an ordinary least squares regression provide you with the best linear unbiased estimates? Why or why not? b. What is the transformed model that would give you constant error variance? c. Given the following data: y = (4, 3, 1, 0, 2) and x = (1, 2, 1, 3, 4), find the generalized least squares estimates of β_1 and β_2. (Do this by hand! Not with Excel.)
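The by-hand computation in part (c) can be checked with exact rational arithmetic. Because Var(e_t) = σ² x_t², generalized least squares here reduces to weighted least squares with weights w_t = 1/x_t² (equivalently, OLS on the transformed model y_t/x_t = β_1/x_t + β_2 + e_t/x_t, whose error variance is the constant σ²). This is a verification sketch, not a substitute for the hand calculation.

```python
from fractions import Fraction

# GLS for Var(e_t) = sigma^2 * x_t^2 via weighted normal equations,
# using exact fractions so the answer matches the hand computation.
x = [1, 2, 1, 3, 4]
y = [4, 3, 1, 0, 2]
w = [Fraction(1, xi * xi) for xi in x]      # weights w_t = 1/x_t^2

# Weighted normal equations:
#   b1*Sum(w)   + b2*Sum(w x)   = Sum(w y)
#   b1*Sum(w x) + b2*Sum(w x^2) = Sum(w x y)
sw   = sum(w)
swx  = sum(wi * xi for wi, xi in zip(w, x))
swxx = sum(wi * xi * xi for wi, xi in zip(w, x))
swy  = sum(wi * yi for wi, yi in zip(w, y))
swxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))

# Solve the 2x2 system by Cramer's rule.
det = sw * swxx - swx * swx
b1 = (swy * swxx - swx * swxy) / det
b2 = (sw * swxy - swx * swy) / det

print(b1, float(b1))   # 561/188 ≈ 2.9840
print(b2, float(b2))   # -331/752 ≈ -0.4402
```

Working by hand, Σw = 349/144, Σwx = 37/12, Σwx² = 5, Σwy = 47/8, and Σwxy = 7, which yields the same β̂_1 = 561/188 and β̂_2 = −331/752.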