4) Consider n data points with 2 covariates and observations $\{x_{i,1}, x_{i,2}, y_i\}$, $i = 1, \dots, n$, ...
1) Consider n data points with 3 covariates and observations $\{x_{i,1}, x_{i,2}, x_{i,3}, y_i\}$, $i = 1, \dots, n$, and you fit the model $y_i = \beta_0 + \beta_1 x_{i,1} + \beta_2 x_{i,2} + \beta_3 x_{i,3} + \varepsilon_i$, where the $\varepsilon_i$'s are independent normal with mean zero and variance $\sigma^2$. For an observed covariate vector $z = (1, z_1, z_2, z_3)$ (with the intercept and three regressor variables) and observed $y_z$ at that point: a) write the expression for the estimated variance of the fit $\hat{y}_z$ at $z$. (Let...
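A quick numerical sketch of what part (a) is asking for: the estimated variance of the fit at $z$ is $s^2 \, z'(X'X)^{-1}z$, with $s^2$ the residual mean square. The data, coefficients, and the point $z$ below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])  # intercept + 3 covariates
beta = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ beta + rng.normal(scale=0.3, size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y
resid = y - X @ beta_hat
s2 = resid @ resid / (n - 4)          # unbiased estimate of sigma^2 (p = 4 parameters)

z = np.array([1.0, 0.5, -0.2, 1.1])   # hypothetical covariate vector (1, z1, z2, z3)
var_fit = s2 * z @ XtX_inv @ z        # estimated Var(z' beta_hat), the variance of the fit at z
```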
Suppose that the covariates $x_{j,i}$ for $i = 1, 2, \dots, n$ and $j = 1, 2, \dots, d$ are indicator variables for a single categorical variable, coded in the manner covered in the course. Thus, suppose that for each individual $i = 1, 2, \dots, n$ exactly one of $x_{1,i}, x_{2,i}, \dots, x_{d,i}$ is equal to 1. Let $\hat\beta_k$ be the $k$-th component of $(\hat\beta_1, \dots, \hat\beta_d)$, the minimizer of $L(b_1, b_2, \dots, b_d)$ of eq. (B... Show that $\hat\beta_k = \bar{Y}_{n(k)}$, where...
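The claim $\hat\beta_k = \bar{Y}_{n(k)}$ (each coefficient equals the response mean within group $k$) can be checked numerically. A minimal sketch with synthetic data, assuming $d = 3$ levels and the one-indicator-per-level, no-intercept coding:

```python
import numpy as np

rng = np.random.default_rng(1)
groups = rng.integers(0, 3, size=60)   # categorical variable with d = 3 levels
y = rng.normal(loc=groups.astype(float), size=60)

X = np.eye(3)[groups]                  # one indicator column per level, no intercept
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# least-squares coefficients coincide with the per-group response means
group_means = np.array([y[groups == k].mean() for k in range(3)])
```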
1. Given data on $(y_i, x_i)$ for $i = 1, \dots, n$, consider the following least squares problem for a simple linear regression in $(b_0, b_1)$. We assume the four linear regression model assumptions discussed in class hold. (i) Compute the partial derivatives of the objective function. (ii) Set the partial derivatives derived in (i) equal to zero. Explain why the resulting equations are called 'normal equations'. (Hint: two n-dimensional vectors $(v_i)_{i=1}^n$ and $(w_i)_{i=1}^n$ are normal (orthogonal) if $\sum_{i=1}^n v_i w_i = 0$.)...
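A numerical check of what the normal equations assert: at the least-squares solution the residual vector is orthogonal to the constant vector and to $x$. Synthetic data, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=50)
y = 1.5 + 2.0 * x + rng.normal(scale=0.5, size=50)

# Setting the two partial derivatives to zero gives the normal equations:
#   sum(y - b0 - b1*x)       = 0
#   sum((y - b0 - b1*x) * x) = 0
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

resid = y - b0 - b1 * x
# "normal" in the orthogonality sense: residuals are orthogonal to 1 and to x
```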
3. Consider the models $Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i$, $i = 1, 2, 3, \dots, N$, and $X_i = \alpha_0 + \alpha_1 y_i + u_i$, $i = 1, 2, 3, \dots, N$. Find the relationship between the estimates of $\alpha$ and $\beta$.
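The relationship being asked for can be verified numerically: the two slope estimates satisfy $\hat\beta_1 \hat\alpha_1 = r^2$, the squared sample correlation. A sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=200)
y = 0.5 + 0.8 * x + rng.normal(scale=1.0, size=200)

sxy = np.sum((x - x.mean()) * (y - y.mean()))
beta1 = sxy / np.sum((x - x.mean()) ** 2)    # slope from regressing y on x
alpha1 = sxy / np.sum((y - y.mean()) ** 2)   # slope from regressing x on y
r = np.corrcoef(x, y)[0, 1]                  # sample correlation
```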
2. Consider a simple linear regression model for a response variable $Y_i$ and a single predictor variable $x_i$, $i = 1, \dots, n$, with Gaussian (i.e. normally distributed) errors: $Y_i = \beta x_i + \varepsilon_i$, $\varepsilon_i \overset{\text{i.i.d.}}{\sim} N(0, \sigma^2)$. This model is often called "regression through the origin" since $E(Y_i) = 0$ if $x_i = 0$. (a) Write down the likelihood function for the parameters $\beta$ and $\sigma^2$. (b) Find the MLEs for $\beta$ and $\sigma^2$, explicitly showing that they are unique maximizers of the likelihood function. (Hint: The function...
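As a sanity check on part (b): under this Gaussian model the MLEs work out to $\hat\beta = \sum x_i Y_i / \sum x_i^2$ and $\hat\sigma^2 = \frac{1}{n}\sum (Y_i - \hat\beta x_i)^2$, and $\hat\beta$ agrees with the least-squares fit on the single-column design. A sketch with synthetic data:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
x = rng.uniform(1, 3, size=n)
y = 2.0 * x + rng.normal(scale=0.4, size=n)

beta_mle = np.sum(x * y) / np.sum(x ** 2)      # maximizes the likelihood in beta
sigma2_mle = np.mean((y - beta_mle * x) ** 2)  # maximizes the likelihood in sigma^2

# least squares on the single-column design matrix gives the same beta
beta_ls, *_ = np.linalg.lstsq(x[:, None], y, rcond=None)
```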
Consider the zero-intercept model given by $Y_i = \beta_1 X_i + e_i$ ($i = 1, \dots, n$) with the $e_i$ normal, independent, with variance $\sigma^2$. For this model: (i) find $\sum_i (Y_i - \hat{Y}_i)$. (ii) find $\sum_i (Y_i - \hat{Y}_i) X_i$. (iii) find the estimator of the error variance $\sigma^2$. (iv) is the estimator of the error variance biased?
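A numerical illustration of parts (i)-(iii): without an intercept, $\sum (Y_i - \hat{Y}_i) X_i = 0$ exactly (the single normal equation), but $\sum (Y_i - \hat{Y}_i)$ need not be zero; the unbiased variance estimator divides by $n - 1$ since one parameter is estimated. Synthetic data:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50
x = rng.uniform(1, 2, size=n)
y = 1.5 * x + rng.normal(scale=0.3, size=n)

beta_hat = np.sum(x * y) / np.sum(x ** 2)
resid = y - beta_hat * x

sum_resid = resid.sum()       # need not be zero: there is no intercept in the model
sum_resid_x = resid @ x       # exactly zero by the normal equation
s2 = resid @ resid / (n - 1)  # unbiased: one estimated parameter
```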
1. Suppose the data are generated by the model $y_i = \beta_2 x_i + \varepsilon_i$. Suppose further that $E(\varepsilon_i) = 0$, $\mathrm{var}(\varepsilon_i) = \sigma^2$, and $(x_i, y_i)$ is i.i.d. with finite fourth moments and jointly normal. But you mistakenly estimate it using the following model: $y_i = \alpha_1 + \alpha_2 x_i + e_i$, and obtain the estimated coefficient parameters. Without looking at the analysis report, determine whether the following statements are true or false; please briefly explain. (a) $\sum_i \hat{e}_i = 0$ (b)...
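A numerical sketch relevant to statement (a): whenever an intercept is included in the fitted model, the residuals sum to zero regardless of whether the true model has one. Synthetic data:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 80
x = rng.normal(size=n)
y = 1.2 * x + rng.normal(scale=0.5, size=n)  # true model has no intercept

X = np.column_stack([np.ones(n), x])         # but we (mistakenly) fit one
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ coef                         # residuals from the mis-specified fit
```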
3. Consider the multiple linear regression model $Y_i = \beta_0 + \beta_1 x_{1,i} + \dots + \beta_{p-1} x_{p-1,i} + \varepsilon_i$, where $x_{1,i}, \dots, x_{p-1,i}$ are observed covariate values for observation $i$, and $\varepsilon_i \overset{\text{i.i.d.}}{\sim} N(0, \sigma^2)$. (a) What is the interpretation of $\beta_1$ in this model? (b) Write the matrix form of the model. Label the response vector, design matrix, coefficient vector, and error vector, and specify the dimensions and elements of each. (c) Write the likelihood and log-likelihood in matrix form. (d) Solve $\partial \ell / \partial \beta = 0$ for $\beta$, the MLE...
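As a check on part (d): setting the gradient of the log-likelihood to zero gives $X'X\beta = X'y$, so the MLE is $\hat\beta = (X'X)^{-1}X'y$, which matches the least-squares solver. A sketch with made-up dimensions and coefficients:

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 60, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, -0.5, 2.0, 0.3])
y = X @ beta + rng.normal(scale=0.2, size=n)

# gradient of the log-likelihood = 0  <=>  X'X beta = X'y
beta_mle = np.linalg.solve(X.T @ X, X.T @ y)
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
```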
linear stat modeling & regression 1) Consider n data points with 3 covariates and observations $\{x_{i,1}, x_{i,2}, x_{i,3}, y_i\}$, $i = 1, \dots, n$, and you fit the model $y_i = \beta_0 + \beta_1 x_{i,1} + \beta_2 x_{i,2} + \beta_3 x_{i,3} + \varepsilon_i$, where the $\varepsilon_i$'s are independent normal with mean zero and variance $\sigma^2$. Let $Y$ be the vector $(Y_1, \dots, Y_n)$. Assume the covariates are centered: $\sum_i x_{i,k} = 0$, $k = 1, 2, 3$. Here, $n = 50$. Assume $X'X$ is a diagonal matrix...
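The point of the diagonal-$X'X$ assumption can be seen numerically: when the columns of the design are centered and mutually orthogonal, each coefficient estimate decouples into $\hat\beta_k = \sum_i x_{i,k} y_i / \sum_i x_{i,k}^2$. A sketch that constructs such a design via QR (the data are synthetic):

```python
import numpy as np

rng = np.random.default_rng(8)
n = 50
# Build centered, mutually orthogonal covariates so that X'X is diagonal
Q, _ = np.linalg.qr(np.column_stack([np.ones(n), rng.normal(size=(n, 3))]))
X = np.column_stack([np.ones(n), Q[:, 1:]])  # intercept + 3 orthogonal, centered columns
beta = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ beta + rng.normal(scale=0.1, size=n)

beta_hat_full, *_ = np.linalg.lstsq(X, y, rcond=None)
# with diagonal X'X each coefficient can be computed one column at a time
beta_hat_marginal = np.array([X[:, k] @ y / (X[:, k] @ X[:, k]) for k in range(4)])
```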
In the simple linear regression with zero constant term for $(x_i, y_i)$, $i = 1, 2, \dots, n$: $Y_i = \beta x_i + \varepsilon_i$, where $\{\varepsilon_i\}_{i=1}^n$ are i.i.d. $N(0, \sigma^2)$. (a) Derive the normal equation that the LS estimator $\hat\beta$ satisfies. (b) Show that the LS estimator of $\beta$ is given by $\hat\beta = \sum_{i=1}^n x_i Y_i \big/ \sum_{i=1}^n x_i^2$. (c) Show that $E(\hat\beta) = \beta$, $\mathrm{Var}(\hat\beta) = \sigma^2 \big/ \sum_{i=1}^n x_i^2$.
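Part (c) can be illustrated by simulation: over repeated samples with a fixed design, the average of $\hat\beta$ approaches $\beta$ and its empirical variance approaches $\sigma^2/\sum x_i^2$. A sketch with made-up parameter values:

```python
import numpy as np

rng = np.random.default_rng(9)
n, beta, sigma = 30, 2.5, 1.0
x = rng.uniform(0.5, 2.0, size=n)        # fixed design across replications

estimates = []
for _ in range(2000):
    y = beta * x + rng.normal(scale=sigma, size=n)
    estimates.append(np.sum(x * y) / np.sum(x ** 2))  # beta_hat for this sample
estimates = np.array(estimates)

var_theory = sigma ** 2 / np.sum(x ** 2)  # Var(beta_hat) = sigma^2 / sum x_i^2
```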