Consider the following linear regression model. 1. For any X = x, let Y = xβ + U, where β ∈ R^k. 2. X is exogenous. 3. The probability model is {f(u; θ) : f is a distribution on R, E_f[U] = 0, Var_f[U] = σ², σ > 0}. 4. Sampling model: {Y_i}, i = 1, ..., n, is an independent sample, sequentially generated using Y_i = x_iβ + U_i, where the U_i are IID(0, σ²). (i) Let K > 0 be a given number. We wish to estimate β using least-squares...
8. (15 POINTS) Consider the following optimization problem: maximize x₁ + x₂ subject to 5x₁² + 6x₁x₂ + 5x₂² = 1 and x₁ ≥ 0, x₂ ≥ 0, where x₁ and x₂ are the choice variables. (a) Write the Lagrangean and the Kuhn-Tucker conditions. (b) State and verify the second-order condition. Distinguish between sufficient and necessary conditions. (c) Is the constraint qualification condition satisfied? Show clearly why or why not. (d) Solve the Kuhn-Tucker conditions for the optimal choices x₁*, x₂*, and...
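A quick numerical sanity check on part (d), under one reading of the garbled statement (the objective is assumed to be x₁ + x₂; by symmetry the Kuhn-Tucker conditions would then give x₁* = x₂* = 1/4):

```python
import numpy as np
from scipy.optimize import minimize

# Assumed reading of the problem: maximize x1 + x2
# subject to 5*x1^2 + 6*x1*x2 + 5*x2^2 = 1, x1 >= 0, x2 >= 0.
objective = lambda x: -(x[0] + x[1])          # minimize the negative
constraint = {"type": "eq",
              "fun": lambda x: 5*x[0]**2 + 6*x[0]*x[1] + 5*x[1]**2 - 1}
res = minimize(objective, x0=[0.1, 0.1], method="SLSQP",
               constraints=[constraint], bounds=[(0, None), (0, None)])
x1, x2 = res.x
# The symmetric KKT candidate is x1 = x2 = 1/4 (so 16*(1/4)^2 = 1),
# with objective value 1/2.
```

The numeric optimum should agree with whatever the analytical Kuhn-Tucker solution turns out to be; if the true objective differs from the assumed one, only the target values change, not the method.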
Consider the following regression model: yᵢ = β₀ + β₁xᵢ + uᵢ, where yᵢ is individual i's university GPA and xᵢ is the individual's high school grades. a. What do you think is in uᵢ? Do you think E[u|x] = 0? b. Suggest a variable that you think might affect university GPA that isn't included in the regression equation but should be. c. What sign of bias would you expect in an OLS regression of y on x? Briefly explain. d. ...
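A small simulation can illustrate the bias logic in part (c). The setup below is hypothetical: "ability" stands in for any omitted variable that raises GPA and is positively correlated with high school grades, and all coefficients are made-up illustrative values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
ability = rng.normal(size=n)                   # hypothetical omitted variable
x = 0.8 * ability + rng.normal(size=n)         # HS grades correlate with ability
y = 1.0 + 0.5 * x + 0.7 * ability + rng.normal(size=n)  # true beta1 = 0.5

# OLS of y on x alone also picks up the ability channel,
# so the estimated slope overstates the true 0.5:
beta1_hat = np.polyfit(x, y, 1)[0]
```

Because Cov(x, u) > 0 here, the bias is upward; flipping the sign of either correlation flips the sign of the bias.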
Problem 3: Absence of an Intercept. Consider the regression model Yᵢ = βXᵢ + uᵢ, where uᵢ and Xᵢ satisfy Assumptions SLR1-SLR5. (i) Let β̃ denote an estimator of β that is constructed as β̃ = Ȳ/X̄, where Ȳ and X̄ are the sample means of Yᵢ and Xᵢ, respectively. Show that β̃ is conditionally unbiased. (ii) Derive the least squares estimator of β. (iii) Show that the estimator is conditionally unbiased. (iv) Derive the conditional variance of the estimator.
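A Monte Carlo sketch of what parts (i)-(iv) predict, with made-up parameter values (β = 2, σ = 1) and the regressor values held fixed across replications so the check is conditional on X:

```python
import numpy as np

rng = np.random.default_rng(1)
beta, sigma = 2.0, 1.0
x = rng.uniform(1.0, 3.0, size=50)            # fixed design (condition on X)

ratio_hats, ols_hats = [], []
for _ in range(20_000):
    u = rng.normal(0.0, sigma, size=x.size)
    y = beta * x + u
    ratio_hats.append(y.mean() / x.mean())    # beta_tilde = Ybar / Xbar
    ols_hats.append((x @ y) / (x @ x))        # no-intercept least squares

# Both averages should sit near beta = 2 (conditional unbiasedness);
# by Cauchy-Schwarz, sum(x^2) >= n*Xbar^2, so OLS has the smaller variance.
```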
5) Consider the simple linear regression model yᵢ = α + βxᵢ + eᵢ, with eᵢ ~ N(0, σ²), i = 1, ..., n. Let ȳ be the mean of the yᵢ, and let α̂ and β̂ be the MLEs of α and β, respectively. Let ŷᵢ = α̂ + β̂xᵢ be the fitted values, and let eᵢ = yᵢ − ŷᵢ be the residuals. a) What is Cov(ȳ, β̂)? b) What is Cov(α̂, β̂)? c) Show that Σᵢ₌₁ⁿ eᵢ = 0. d) Show that Σᵢ₌₁ⁿ xᵢeᵢ = 0. e) Show that Σᵢ₌₁ⁿ ŷᵢeᵢ = ...
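The orthogonality identities in parts (c)-(e) follow from the normal equations, and they hold exactly (up to floating-point error) for any fitted line. A quick check with arbitrary simulated data:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=30)
y = 1.5 + 0.8 * x + rng.normal(size=30)

b, a = np.polyfit(x, y, 1)                    # slope, intercept (MLE = OLS here)
yhat = a + b * x                              # fitted values
e = y - yhat                                  # residuals

# The normal equations force all three sums to vanish:
s0, s1, s2 = e.sum(), (x * e).sum(), (yhat * e).sum()
```

Note that the third identity is implied by the first two, since ŷᵢ is a linear combination of 1 and xᵢ.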
Consider the simple linear regression model yᵢ = β₀ + β₁xᵢ + εᵢ, where the errors ε₁, ..., εₙ are i.i.d. random variables with E[εᵢ] = 0, Var(εᵢ) = σ², i = 1, ..., n. Solve either one of the questions below. 1. Let β̂₁ be the least squares estimator for β₁. Show that β̂₁ is the best linear unbiased estimator for β₁. (Note: you can read the proof on Wikipedia, but you cannot use matrix notation in this proof.) 2. Consider a new loss function L_λ(·, ·) ... where...
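Question 1 asks for the Gauss-Markov property. A simulation can illustrate (not prove) it: compare the OLS slope against another linear unbiased estimator of β₁, here the two-endpoint slope (y_n − y₁)/(x_n − x₁), chosen only as a simple example:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 25)                 # fixed regressor values
beta0, beta1 = 1.0, 2.0                       # illustrative parameter values

ols, naive = [], []
for _ in range(20_000):
    y = beta0 + beta1 * x + rng.normal(size=x.size)
    ols.append(np.polyfit(x, y, 1)[0])        # least-squares slope
    naive.append((y[-1] - y[0]) / (x[-1] - x[0]))  # another linear unbiased estimator

# Both estimators center on beta1 = 2, but OLS has strictly smaller variance,
# as the BLUE property requires.
```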
2. The linear regression model in matrix format is Y = Xβ + e, with the usual definitions. Let E(e|X) = 0 and Var(e|X) = Σ = diag(γ₁, γ₂, ..., γ_N). Notice that, as a covariance matrix, Σ is symmetric and nonnegative definite. (i) Derive Var(β̂_OLS | X). (ii) Let β̃ = C′Y be any other linear unbiased estimator, where C is an N × K function of X. Prove Var(β̃ | X) ≥ (X′Σ⁻¹X)⁻¹. 3. An oracle...
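A numerical illustration of parts (i) and (ii), with a randomly drawn design and diagonal Σ (all values illustrative): the sandwich formula gives Var(β̂_OLS|X), and the difference from the GLS lower bound (X′Σ⁻¹X)⁻¹ should be positive semidefinite:

```python
import numpy as np

rng = np.random.default_rng(4)
N, K = 40, 3
X = rng.normal(size=(N, K))
gamma = rng.uniform(0.5, 3.0, size=N)         # diagonal entries of Sigma
Sigma = np.diag(gamma)

XtX_inv = np.linalg.inv(X.T @ X)
var_ols = XtX_inv @ X.T @ Sigma @ X @ XtX_inv   # sandwich form of Var(b_OLS | X)
gls_bound = np.linalg.inv(X.T @ np.linalg.inv(Sigma) @ X)

# Loewner ordering: var_ols - gls_bound should have no negative eigenvalues
eigs = np.linalg.eigvalsh(var_ols - gls_bound)
```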
Please answer all the parts; this is all the information. 3. (a) Let L = L(x, y₁, y₂, y₁′, y₂′), where yᵢ′ = ∂yᵢ/∂x, i = 1, 2, be a Lagrangian satisfying the Euler-Lagrange equations which is independent of y₂. Show that ∂L/∂y₂′ = constant. You are given that the motion in the plane of a particle of mass m has Lagrangian L = ½m(ṙ² + r²θ̇²) − V(r), where r and θ are polar coordinates, V is the potential, and the dots indicate derivatives with...
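For the given Lagrangian, θ is the cyclic coordinate (L does not contain θ), so the conserved quantity from part (a) is ∂L/∂θ̇. A symbolic check of that derivative (variable names are illustrative):

```python
import sympy as sp

m, r, rdot, theta_dot = sp.symbols("m r rdot thetadot", positive=True)
V = sp.Function("V")

# L = (1/2) m (rdot^2 + r^2 thetadot^2) - V(r); theta itself never appears
L = sp.Rational(1, 2) * m * (rdot**2 + r**2 * theta_dot**2) - V(r)

# Conserved momentum conjugate to the cyclic coordinate theta:
p_theta = sp.diff(L, theta_dot)               # = m * r**2 * thetadot
```

The result m r² θ̇ is the angular momentum, constant along the motion by the argument of part (a).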
3. Let y = (y₁, ..., yₙ)′ be a set of responses, and consider the linear model y = μ1 + ε, where 1 = (1, ..., 1)′ and ε is a vector of zero-mean, uncorrelated errors with variance σ². This is a linear model in which the responses have a constant but unknown mean μ. We will call this model the location model. (a) If we write the location model in the usual form of the linear model y = Xβ + ...
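In the usual linear-model form, the location model's design matrix is a single column of ones, and least squares then reduces to the sample mean. A minimal check with arbitrary made-up data:

```python
import numpy as np

y = np.array([2.1, 3.4, 1.9, 2.8, 3.0])
X = np.ones((y.size, 1))                      # design matrix: one column of ones

mu_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
# With X = 1, the normal equations give mu_hat = ybar, the sample mean
```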