Consider the regression model
Yᵢ = β₀ + β₁xᵢ + εᵢ,
where the εᵢ are i.i.d. N(0, σ²) random variables, for i = 1, 2, ..., n.
(a) (4 points) Show that β̂₁ is normally distributed with mean β₁ and variance σ²/S_XX.
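A Monte Carlo sketch of the claim in (a): over repeated simulated datasets, the slope estimates should average to β₁ with empirical variance close to σ²/S_XX. All parameter values and design points below are illustrative assumptions, not given in the problem.

```python
import numpy as np

# Illustrative (assumed) parameter values and fixed design points.
rng = np.random.default_rng(0)
beta0, beta1, sigma = 2.0, 0.5, 1.0
x = np.linspace(0.0, 10.0, 20)
sxx = np.sum((x - x.mean()) ** 2)        # S_XX = sum of (x_i - xbar)^2

# Simulate many datasets and record the least squares slope each time.
n_sims = 20000
slopes = np.empty(n_sims)
for s in range(n_sims):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
    slopes[s] = np.sum((x - x.mean()) * (y - y.mean())) / sxx

# Empirically, mean(beta1_hat) ~ beta1 and var(beta1_hat) ~ sigma^2 / S_XX.
print(slopes.mean(), slopes.var(), sigma**2 / sxx)
```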
2. Consider the simple linear regression model Yᵢ = β₀ + β₁xᵢ + εᵢ, where ε₁, ..., εₙ are i.i.d. N(0, σ²), for i = 1, 2, ..., n. Suppose that we would like to estimate the mean response at x = x*, that is, we want to estimate μ_{Y|x*} = β₀ + β₁x*. The least squares estimator for μ_{Y|x*} is μ̂_{Y|x*} = b₀ + b₁x*, where b₀, b₁ are the least squares estimators for β₀, β₁. (a) Show that the least squares estimator for...
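A small numerical illustration of the plug-in estimator μ̂_{Y|x*} = b₀ + b₁x*. The data and the choice of x* are made up for the demo; x* is deliberately set equal to x̄ so that μ̂ reduces to ȳ, a handy sanity check.

```python
import numpy as np

# Made-up data for illustration only.
rng = np.random.default_rng(1)
x = np.arange(1.0, 11.0)                     # n = 10 design points
y = 3.0 + 2.0 * x + rng.normal(0.0, 1.0, x.size)

# Least squares estimators b0, b1 from the usual closed forms.
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

# Plug-in estimator of the mean response at x = x_star.
x_star = 5.5                                 # equals xbar here by construction
mu_hat = b0 + b1 * x_star

# Since x_star = xbar, mu_hat coincides with ybar.
print(mu_hat, y.mean())
```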
In the simple linear regression with zero intercept term for (xᵢ, yᵢ), where i = 1, 2, ..., n:
Yᵢ = βxᵢ + εᵢ,
where {εᵢ}, i = 1, ..., n, are i.i.d. N(0, σ²).
(a) Derive the normal equation that the LS estimator, β̂, satisfies.
(b) Show that the LS estimator of β is given by β̂ = (Σ_{i=1}^n xᵢYᵢ) / (Σ_{i=1}^n xᵢ²).
(c) Show that E(β̂) = β, Var(β̂) = σ...
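The closed form in (b) can be checked numerically against a generic least squares solver: fitting a single-column design matrix (no intercept) should reproduce Σxᵢyᵢ / Σxᵢ² exactly. The data below are simulated with assumed parameter values.

```python
import numpy as np

# Simulated data with assumed beta and sigma, for illustration only.
rng = np.random.default_rng(2)
beta, sigma = 1.5, 0.5
x = rng.uniform(1.0, 5.0, size=50)
y = beta * x + rng.normal(0.0, sigma, size=x.size)

# (b): beta_hat = sum(x_i * y_i) / sum(x_i^2), which solves the normal
# equation sum(x_i * (y_i - beta_hat * x_i)) = 0 from (a).
beta_hat = np.sum(x * y) / np.sum(x * x)

# Same fit via least squares on the one-column design matrix (no intercept).
beta_lstsq = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]
print(beta_hat, beta_lstsq)
```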
1. Consider the simple linear regression model Yᵢ = β₀ + β₁xᵢ + εᵢ, where ε₁, ..., εₙ are i.i.d. N(0, σ²), for i = 1, 2, ..., n. Let b₁ = S_xy/S_xx and b₀ = Ȳ − b₁x̄ be the least squares estimators of β₁ and β₀, respectively. We showed in class that Ȳ ~ N(β₀ + β₁x̄, σ²/n) and b₁ ~ N(β₁, σ²/S_xx), and that b₁ and Ȳ are uncorrelated, i.e. σ{Ȳ, b₁} = 0. (a) Show that b₀ is...
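A simulation sketch of the facts quoted from class: the empirical correlation of Ȳ and b₁ is near zero, and b₀ = Ȳ − b₁x̄ is centered at β₀ with variance σ²(1/n + x̄²/S_xx). Parameter values and design points are illustrative assumptions.

```python
import numpy as np

# Assumed parameter values and fixed design, for illustration only.
rng = np.random.default_rng(3)
beta0, beta1, sigma = 1.0, 2.0, 1.0
x = np.linspace(1.0, 9.0, 15)
sxx = np.sum((x - x.mean()) ** 2)

n_sims = 20000
ybars = np.empty(n_sims)
b1s = np.empty(n_sims)
for s in range(n_sims):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
    ybars[s] = y.mean()
    b1s[s] = np.sum((x - x.mean()) * y) / sxx
b0s = ybars - b1s * x.mean()

print(np.corrcoef(ybars, b1s)[0, 1])   # near 0: Ybar and b1 uncorrelated
print(b0s.mean(), b0s.var(), sigma**2 * (1.0 / x.size + x.mean()**2 / sxx))
```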
Please help with question 4
Consider the simple linear regression model yᵢ = α + βxᵢ + εᵢ, with σ² known. Assume the x's are fixed and known, and only the y's are random. Recall Ex 3.5.22 in Homework 1. Here the design matrix X has rows (1, xᵢ) and the regression coefficient is β = (α, β)ᵀ. 3. Derive the MLE of α and β and show that it is independent of σ². Is your MLE the same as the least squares estimation in Ex 3.5.22? 4. Derive the mean...
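The key point in part 3 is that with Gaussian errors, maximizing the likelihood in (α, β) is equivalent to minimizing the residual sum of squares, so the maximizer is (XᵀX)⁻¹Xᵀy and never involves σ². A sketch with made-up data (the parameter values are my own assumptions):

```python
import numpy as np

# Made-up data; alpha, beta, and the error scale are assumed values.
rng = np.random.default_rng(4)
alpha, beta = 0.7, -1.2
x = rng.normal(size=30)
y = alpha + beta * x + rng.normal(0.0, 2.0, size=x.size)

X = np.column_stack([np.ones_like(x), x])      # design matrix, rows (1, x_i)
mle = np.linalg.solve(X.T @ X, X.T @ y)        # (alpha_hat, beta_hat)

# The least squares answer agrees with the closed-form MLE, and nothing
# above used sigma^2 -- the estimator is the same for any error variance.
ls = np.linalg.lstsq(X, y, rcond=None)[0]
print(mle, ls)
```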
3. Consider the multiple linear regression model Yᵢ = β₀ + β₁X₁,ᵢ + ··· + β_{p−1}X_{p−1,i} + εᵢ, where X₁,ᵢ, ..., X_{p−1,i} are observed covariate values for observation i, and the εᵢ are i.i.d. N(0, σ²). (a) What is the interpretation of β₁ in this model? (b) Write the matrix form of the model. Label the response vector, design matrix, coefficient vector, and error vector, and specify the dimensions and elements of each. (c) Write the likelihood, the log-likelihood, and ∂ℓ/∂β in matrix form. (d) Solve ∂ℓ/∂β = 0 for β̂, the MLE...
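A numerical sketch of (b)-(d): writing Y = Xβ + ε with an n × p design matrix whose first column is ones, the solution of ∂ℓ/∂β = 0 is β̂ = (XᵀX)⁻¹XᵀY, and the residuals are orthogonal to the columns of X (the normal equations). The dimensions and data below are illustrative assumptions.

```python
import numpy as np

# Illustrative dimensions and simulated covariates (p - 1 = 3 of them).
rng = np.random.default_rng(5)
n, p = 100, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # n x p design
beta = np.array([1.0, 0.5, -0.3, 2.0])                          # p-vector
Y = X @ beta + rng.normal(0.0, 1.0, size=n)                     # n-vector

# MLE / least squares solution of the normal equations X^T X beta = X^T Y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# Normal equations: X^T (Y - X beta_hat) = 0 (up to rounding error).
print(beta_hat)
print(X.T @ (Y - X @ beta_hat))
```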
Consider the regression model yᵢ = β₀ + β₁Xᵢ + εᵢ for i = 1, ..., n, where Xᵢ is a dummy variable (0 = failure and 1 = success). Suppose that the data set contains n₁ failures and n₂ successes (with n₁ + n₂ = n).
Obtain the XᵀX matrix.
Obtain the XᵀY matrix.
Obtain the least squares estimate b.
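With a 0/1 dummy covariate, XᵀX = [[n, n₂], [n₂, n₂]], XᵀY stacks the total of all y's over the total for the successes, and solving gives b₀ = mean of the failure group and b₁ = difference of the group means. A quick check with made-up counts and responses:

```python
import numpy as np

# Made-up counts and responses for illustration.
n1, n2 = 4, 6                                   # failures, successes
x = np.array([0] * n1 + [1] * n2, dtype=float)
y = np.array([2.0, 3.0, 2.5, 2.5, 5.0, 6.0, 5.5, 5.0, 6.5, 6.0])

X = np.column_stack([np.ones_like(x), x])
XtX = X.T @ X                                    # [[n, n2], [n2, n2]]
XtY = X.T @ y                                    # [sum(y), sum over successes]
b = np.linalg.solve(XtX, XtY)

# b[0] is the failure-group mean; b[1] is the gap between group means.
print(XtX)
print(b, y[:n1].mean(), y[n1:].mean() - y[:n1].mean())
```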
Problem 7. Let X₁, X₂, ..., Xₙ be i.i.d. (independent and identically distributed) random variables with unknown mean μ and variance σ². In order to estimate μ and σ² from the data we consider the following estimates: μ̂ = (1/n) Σ_{i=1}^n Xᵢ and σ̂² = (1/(n−1)) Σ_{i=1}^n (Xᵢ − μ̂)². Show that both these estimates are unbiased. That is, show that E(μ̂) = μ and E(σ̂²) = σ².
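A Monte Carlo illustration of unbiasedness, assuming the estimators in question are the standard sample mean and the 1/(n−1) sample variance: averaged over many samples, they land on μ and σ². The distribution and parameter values are arbitrary choices for the demo.

```python
import numpy as np

# Arbitrary (assumed) distribution and parameters for the demo.
rng = np.random.default_rng(6)
mu, sigma, n = 3.0, 2.0, 10

n_sims = 50000
mu_hats = np.empty(n_sims)
var_hats = np.empty(n_sims)
for s in range(n_sims):
    x = rng.normal(mu, sigma, size=n)
    mu_hats[s] = x.mean()
    var_hats[s] = x.var(ddof=1)      # divide by n - 1, not n

# Averages over simulations approximate E(mu_hat) and E(sigma2_hat).
print(mu_hats.mean(), var_hats.mean())
```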
Problem 5 of 5: Sum of random variables. Let N(μ, σ²) denote the Gaussian (or normal) pdf with mean μ and variance σ², namely f_X(x) = (1/√(2πσ²)) exp(−(x − μ)²/(2σ²)). Let X and Y be two i.i.d. random variables distributed as Gaussian with mean 0 and variance 1. 1. Show that Z = X + Y is again a Gaussian random variable but with mean 0 and variance 2. Show your full proof with integrals. 2. From the above, can you derive what will be the...
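A numerical sanity check of the claim in part 1 (the actual proof requires the convolution integral; this only verifies the moments of Z = X + Y):

```python
import numpy as np

# Draw a large sample of Z = X + Y with X, Y ~ N(0, 1) independent.
rng = np.random.default_rng(7)
n = 1_000_000
z = rng.normal(size=n) + rng.normal(size=n)

# Sample mean should be near 0 and sample variance near 2.
print(z.mean(), z.var())
```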
4. (24 marks) Suppose that the random variables Y₁, ..., Yₙ satisfy Yᵢ = β₀ + β₁Xᵢ + εᵢ, i = 1, ..., n, where β₀ and β₁ are parameters, X₁, ..., Xₙ are constants, and ε₁, ..., εₙ are independent and identically distributed random variables with εᵢ ~ N(0, σ²), where σ² is a third unknown parameter. This is the familiar form of a simple linear regression model, where the parameters β₀, β₁, and σ² explain the relationship between a dependent (or response) variable Y...
Problem 7. Consider the simple linear regression model Yᵢ = β₀ + β₁Xᵢ + εᵢ for i = 1, 2, ..., n, where the errors εᵢ are uncorrelated, have mean zero, and have common variance Var[εᵢ] = σ². Suppose that the Xᵢ are in centimeters and we want to write the model in inches. If one centimeter = c inches, with c known, we can write the above model as Yᵢ = γ₀ + γ₁Zᵢ + εᵢ, where Zᵢ is Xᵢ converted to inches. Can you obtain the least-squares...
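A numerical sketch of the rescaling: with Zᵢ = c·Xᵢ, refitting by least squares leaves the intercept unchanged and divides the slope by c, i.e. γ̂₁ = β̂₁/c. The data, the true coefficients, and the helper function `ls_fit` are my own illustrative choices (c = 1/2.54 for centimeters to inches).

```python
import numpy as np

# Made-up data in centimeters; coefficients are assumed values.
rng = np.random.default_rng(8)
c = 1.0 / 2.54                      # one centimeter = c inches
x_cm = rng.uniform(10.0, 50.0, size=40)
y = 4.0 + 0.8 * x_cm + rng.normal(0.0, 1.0, size=x_cm.size)

def ls_fit(x, y):
    """Closed-form simple linear regression estimates (intercept, slope)."""
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    return y.mean() - b1 * x.mean(), b1

b0, b1 = ls_fit(x_cm, y)            # fit in centimeters
g0, g1 = ls_fit(c * x_cm, y)        # same response, covariate in inches

# Intercepts match; the slopes differ exactly by the factor 1/c.
print(b0, g0, b1, g1 * c)
```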