2. Consider a simple linear regression model for a response variable Y_i, a single predictor variable x_i, i = 1, ..., n, and having Gaussian (i.e. normally distributed) errors: Y_i = β x_i + ε_i, with ε_i i.i.d. N(0, σ²). This model is often called "regression through the origin" since E(Y_i) = 0 if x_i = 0. (a) Write down the likelihood function for the parameters β and σ². (b) Find the MLEs for β and σ², explicitly showing that they are unique maximizers of the likelihood function. (Hint: The function...
Hi all, I need help with these questions. Here is my work so far; in (b) I am having trouble showing it is a "unique" maximizer for the variance. I would also appreciate it if someone with a good heart could do the rest of the problems. Thank you in advance. 2. Consider a simple linear regression model for a response variable Y_i, a single predictor variable x_i, i = 1, ..., n, and having Gaussian (i.e. normally distributed) errors: Y_i = β x_i + ε_i, ε_i i.i.d....
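For part (b): the closed-form MLEs are β̂ = Σ x_i y_i / Σ x_i² and σ̂² = (1/n) Σ (y_i − β̂ x_i)², and uniqueness in β follows because the sum of squared errors is strictly convex in β (its second derivative is 2 Σ x_i² > 0). A minimal numerical sketch of this, with made-up data and variable names of my own (not from the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta, sigma = 200, 2.5, 1.0
x = rng.uniform(1, 5, n)
y = beta * x + rng.normal(0, sigma, n)

# MLE for beta: solves d/dbeta of sum((y - beta*x)^2) = 0
beta_hat = np.sum(x * y) / np.sum(x ** 2)
# MLE for sigma^2: average squared residual (divisor n, not n-1)
sigma2_hat = np.mean((y - beta_hat * x) ** 2)

# The SSE is strictly convex in beta, so a grid search around beta_hat
# should never find a meaningfully better value than the closed form:
grid = np.linspace(beta_hat - 1, beta_hat + 1, 1001)
sse = np.array([np.sum((y - b * x) ** 2) for b in grid])
assert abs(grid[np.argmin(sse)] - beta_hat) < 1e-2
```

This only checks the β part numerically; the uniqueness argument for σ̂² still needs the one-dimensional calculus step (the profiled log-likelihood in σ² has a single stationary point that is a maximum).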
5) Consider the simple linear regression model y_i = α + β x_i + ε_i, ε_i i.i.d. N(0, σ²), i = 1, ..., n. Let ȳ be the mean of the y_i, and let α̂ and β̂ be the MLEs of α and β, respectively. Let ŷ_i = α̂ + β̂ x_i be the fitted values, and let e_i = y_i − ŷ_i be the residuals. (a) What is Cov(ȳ, β̂)? (b) What is Cov(α̂, β̂)? (c) Show that Σ_{i=1}^n e_i = 0. (d) Show that Σ_{i=1}^n x_i e_i = 0. (e) Show that Σ_{i=1}^n ŷ_i e_i =...
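The identities in (c)–(e) are easy to sanity-check numerically before proving them; a sketch with simulated data (the data and names here are my own, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.normal(0, 1, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5, n)

# OLS (= Gaussian MLE) estimates of alpha and beta
beta_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
alpha_hat = y.mean() - beta_hat * x.mean()

fitted = alpha_hat + beta_hat * x
resid = y - fitted

# (c) residuals sum to zero, (d) they are orthogonal to x,
# (e) hence orthogonal to the fitted values as well
assert abs(resid.sum()) < 1e-8
assert abs((x * resid).sum()) < 1e-8
assert abs((fitted * resid).sum()) < 1e-8
```

Note that (e) follows directly from (c) and (d), since ŷ_i is a linear combination of 1 and x_i.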
3. Consider the linear model: Y_i = α + β x_i + ε_i for i = 1, ..., n, where E(ε_i) = 0. Further assume that Σ_i x_i = 0 and Σ_i x_i² = n. (a) Show that the least squares estimates (LSEs) of α and β are given by α̂ = Ȳ and β̂ = (1/n) Σ_i x_i Y_i. (b) Show that the LSEs in (a) are unbiased. (c) Assume that E(ε_i²) = σ² and E(ε_i ε_j) = 0 for all i ≠ j, where σ² > 0. Show that V(β̂) = σ²/n. (d) Use (b) and (c) above to show that the LSEs are consistent...
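Under the constraints Σ x_i = 0 and Σ x_i² = n, the LSEs collapse to α̂ = Ȳ and β̂ = (1/n) Σ x_i Y_i. A Monte Carlo sketch (my own construction, not from the problem) that also approximates V(β̂) = σ²/n:

```python
import numpy as np

rng = np.random.default_rng(2)
n, alpha, beta, sigma = 100, 1.5, -0.7, 2.0

# Build an x vector satisfying the problem's constraints:
# sum(x) = 0 and sum(x^2) = n
x = rng.normal(0, 1, n)
x = x - x.mean()
x = x * np.sqrt(n / np.sum(x ** 2))

# Many replications to estimate the sampling variance of beta_hat
betas = []
for _ in range(5000):
    y = alpha + beta * x + rng.normal(0, sigma, n)
    betas.append(np.sum(x * y) / n)   # beta_hat = (1/n) * sum(x_i * Y_i)
betas = np.array(betas)

# Unbiasedness (b) and V(beta_hat) = sigma^2 / n (c)
print(betas.mean())       # close to beta = -0.7
print(betas.var() * n)    # close to sigma^2 = 4
```

Consistency in (d) then follows since the variance σ²/n → 0 as n → ∞ while the estimator stays unbiased.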
Please help with question 4. Consider the simple linear regression model y_i = α + β x_i + ε_i with σ² known. Assume the x's are fixed and known, and only the y's are random. Recall Ex 3.5.22 in Homework 1. Here the design matrix X has rows (1, x_i), i = 1, ..., n, and the regression coefficient is β = (α, β)ᵀ. 3. Derive the MLE of α and β and show that it is independent of σ². Is your MLE the same as the least squares estimation in Ex 3.5.22? 4. Derive the mean...
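For question 3: with Gaussian errors, maximizing the likelihood in (α, β) is the same as minimizing the sum of squares, so the MLE solves the normal equations (XᵀX)β̂ = Xᵀy, in which σ² never appears. A minimal sketch of this claim (data and names are my own):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 40
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 1.0, n)

# Design matrix with rows (1, x_i); coefficient vector is (alpha, beta)^T
X = np.column_stack([np.ones(n), x])

# MLE = least squares: solve the normal equations (no sigma^2 involved)
coef = np.linalg.solve(X.T @ X, X.T @ y)
alpha_hat, beta_hat = coef

# Same answer as the least-squares routine, confirming the MLE/LSE match
coef_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(coef, coef_ls)
```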
2. Suppose we are given data on n observations (x_i, y_i), i = 1, ..., n, and we have a linear model, so that E(Y_i) = β₀ + β₁ x_i. Let β̂₁ = SXY/SXX and β̂₀ = Ȳ − β̂₁ x̄ be the least-squares estimates given in lecture. (a) Show that E(SXY) = β₁ SXX and E(Ȳ) = β₀ + β₁ x̄. (b) Use (a) to show that E(β̂₁) = β₁ and E(β̂₀) = β₀. In other words, these are unbiased estimators. (c) The fitted values Ŷ_i = β̂₀ + β̂₁ x_i are used as estimates...
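The unbiasedness claim in (b) can be checked by simulation before proving it, using SXY = Σ(x_i − x̄)(Y_i − Ȳ) and SXX = Σ(x_i − x̄)². A sketch with a fixed design and my own made-up parameter values:

```python
import numpy as np

rng = np.random.default_rng(4)
n, b0, b1 = 30, 4.0, -1.2
x = rng.uniform(-2, 2, n)            # fixed design, reused in every replication
sxx = np.sum((x - x.mean()) ** 2)

b0_hats, b1_hats = [], []
for _ in range(4000):
    y = b0 + b1 * x + rng.normal(0, 1.0, n)
    sxy = np.sum((x - x.mean()) * (y - y.mean()))
    b1_hat = sxy / sxx
    b0_hats.append(y.mean() - b1_hat * x.mean())
    b1_hats.append(b1_hat)

# Averages of the estimates approximate E(beta1_hat) and E(beta0_hat)
print(np.mean(b1_hats))   # close to beta1 = -1.2
print(np.mean(b0_hats))   # close to beta0 = 4.0
```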
Consider the multiple regression model y = Xβ + ε, with E(ε) = 0 and var(ε) = σ²I. Problem 1: Gauss–Markov theorem (revisited). We already know that E(β̂) = β and var(β̂) = σ²(X'X)⁻¹. Consider now another unbiased estimator of β, say b = AY. Since we are assuming that b is unbiased, we reach the conclusion that AX = I (why?). The Gauss–Markov theorem claims that var(b) − var(β̂) is positive semi-definite, which asks that we investigate q'(var(b) − var(β̂))q. Show...
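The quadratic form q'(var(b) − var(β̂))q can be probed numerically: pick any A with AX = I, compare σ²AA' against σ²(X'X)⁻¹, and look at the eigenvalues of the difference. A sketch where the particular A (a perturbation of the least-squares one) is my own choice:

```python
import numpy as np

rng = np.random.default_rng(5)
n, p, sigma2 = 20, 3, 1.0
X = rng.normal(0, 1, (n, p))

# The BLUE: beta_hat = A0 Y with A0 = (X'X)^{-1} X', var = sigma^2 (X'X)^{-1}
XtX_inv = np.linalg.inv(X.T @ X)
A0 = XtX_inv @ X.T

# Another linear unbiased estimator b = A Y: perturb A0 by any C with C X = 0,
# so that A X = A0 X + C X = I still holds
C = rng.normal(0, 0.1, (p, n))
C = C - C @ X @ XtX_inv @ X.T          # make the rows of C orthogonal to col(X)
A = A0 + C
assert np.allclose(A @ X, np.eye(p))   # unbiasedness condition AX = I

# var(b) - var(beta_hat) = sigma^2 (A A' - (X'X)^{-1}); here it equals
# sigma^2 C C', which is positive semi-definite: all eigenvalues >= 0
diff = sigma2 * (A @ A.T - XtX_inv)
eigs = np.linalg.eigvalsh(diff)
assert eigs.min() > -1e-10
```

The decomposition AA' = (X'X)⁻¹ + CC' (the cross terms vanish because CX = 0) is exactly the algebraic heart of the Gauss–Markov proof.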
Consider a Gaussian linear model Y = aX + ε in a Bayesian view, with the prior π(a) = 1 for all a ∈ ℝ. Determine whether each of the following statements is true or false. (1) π(a) is a uniform prior. (a) True (b) False. (2) π(a) is a Jeffreys prior when we consider the likelihood L(Y = y | A = a, X = x) (where we assume x is known). (a) True (b) False. Consider a linear regression model Y = Xβ + σε, where ε ∈ ℝⁿ is a random vector with E[ε] = 0, E[εεᵀ] = I. ...
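For (2), the relevant fact is that the Jeffreys prior is proportional to √I(a), the square root of the Fisher information; if I(a) does not depend on a, the Jeffreys prior is flat. A Monte Carlo sketch under my own added assumption that the error is N(0, σ²) with σ known:

```python
import numpy as np

# Model Y = a*x + eps, eps ~ N(0, sigma^2), with x and sigma treated as known.
# The Fisher information is I(a) = x^2 / sigma^2, constant in a, so the
# Jeffreys prior sqrt(I(a)) is flat, matching pi(a) = 1 up to a constant.
x, sigma = 2.0, 1.0

def fisher_info(a, n_draws=200_000, seed=0):
    # Monte Carlo estimate of I(a) = E[(d/da log f(Y|a))^2]
    rng = np.random.default_rng(seed)
    y = a * x + rng.normal(0, sigma, n_draws)
    score = (y - a * x) * x / sigma ** 2   # d/da of the Gaussian log-density
    return np.mean(score ** 2)

# Same estimated information at very different a's -> flat Jeffreys prior
print(fisher_info(-3.0))   # close to x^2 / sigma^2 = 4
print(fisher_info(5.0))    # close to 4
```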
We consider a multiple linear regression model with LIFE (y) as the response variable, and MALE (x1), BIRTH (x2), DIVO (x3), BEDS (x4), EDUC (x5), and INCO (x6) as predictors.

STATE  MALE   BIRTH  DIVO  BEDS   EDUC  INCO  LIFE
AK     119.1  24.8   5.6   603.3  14.1  4638  69.31
AL      93.3  19.4   4.4   840.9   7.8  2892  69.05
AR      94.1  18.5   4.8   569.6   6.7  2791  70.66
AZ      96.8  21.2   7.2   536.0  12.6  3614  70.55
CA      96.8  18.2   5.7   649.5  13.4  4423  71.71
CO      97.5  ...
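The data fragment can be put into arrays in the y = Xβ + ε form; a sketch using only the complete rows shown above (the full dataset is needed for a meaningful fit, since six predictors plus an intercept require more than five observations):

```python
import numpy as np

# Columns: MALE, BIRTH, DIVO, BEDS, EDUC, INCO, LIFE -- the rows shown above
rows = {
    "AK": [119.1, 24.8, 5.6, 603.3, 14.1, 4638, 69.31],
    "AL": [ 93.3, 19.4, 4.4, 840.9,  7.8, 2892, 69.05],
    "AR": [ 94.1, 18.5, 4.8, 569.6,  6.7, 2791, 70.66],
    "AZ": [ 96.8, 21.2, 7.2, 536.0, 12.6, 3614, 70.55],
    "CA": [ 96.8, 18.2, 5.7, 649.5, 13.4, 4423, 71.71],
}
data = np.array(list(rows.values()))
X = np.column_stack([np.ones(len(data)), data[:, :6]])  # intercept + 6 predictors
y = data[:, 6]                                          # LIFE

# With only 5 rows and 7 coefficients the system is underdetermined;
# lstsq returns a minimum-norm solution, which should not be interpreted.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(X.shape, y.shape)    # (5, 7) (5,)
```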