2. Given the following model: $Y_i = \beta_0 + X_i\beta_1 + u_i$. a. Suppose we estimate the...
Suppose we have a sample of observations for the pair of random variables $(X, Y)$ arranged in the following $2 \times 2$ table with cell counts $a$, $b$, $c$, $d$. Show that the odds ratio can be estimated by $ad/bc$, and derive an estimate of the variance of this estimator.
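A minimal numerical sketch of the result this question asks for, using hypothetical cell counts. The variance estimate shown is the standard delta-method (Woolf) estimate for the *log* odds ratio, $1/a + 1/b + 1/c + 1/d$:

```python
import math

# Hypothetical 2x2 table counts (a, b on row X=0; c, d on row X=1)
a, b, c, d = 20, 10, 5, 15

# Odds ratio estimate: ad/bc
odds_ratio = (a * d) / (b * c)

# Delta-method estimate of Var(log OR): 1/a + 1/b + 1/c + 1/d
var_log_or = 1 / a + 1 / b + 1 / c + 1 / d

# Approximate 95% confidence interval on the OR, built on the log scale
se = math.sqrt(var_log_or)
ci = (math.exp(math.log(odds_ratio) - 1.96 * se),
      math.exp(math.log(odds_ratio) + 1.96 * se))
```

Working on the log scale is the usual choice because the sampling distribution of $\log(ad/bc)$ is much closer to normal than that of $ad/bc$ itself.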
Question 2 (10 points) You are given the following model: $y_i = \beta x_i + e_i$. Consider two alternative estimators of $\beta$: $b_1 = \sum x_i y_i / \sum x_i^2$ and $b_2 = \sum y_i / \sum x_i$.
1. Which estimator would you choose, and why, if the model satisfies all the assumptions of classical regression? Prove your results. (4 points)
2. Now suppose that $\operatorname{var}(e_i) = h x_i$, where $h$ is a positive constant. (a) Obtain the correct variance of the OLS estimator. (2 points) (b) Show that the BLU estimator is now $b_2$. Derive its variance....
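A numerical sketch of part 2, under the assumption $\operatorname{var}(e_i) = h x_i$ (all values hypothetical). Conditional on the $x_i$, the two variances can be computed exactly: $\operatorname{Var}(b_1) = h\sum x_i^3/(\sum x_i^2)^2$ and $\operatorname{Var}(b_2) = h/\sum x_i$, and the Cauchy–Schwarz inequality guarantees the second is never larger:

```python
import numpy as np

rng = np.random.default_rng(0)
h, n = 1.0, 200
x = rng.uniform(1.0, 5.0, size=n)   # fixed positive regressor values

# Exact conditional variances when var(e_i) = h * x_i:
# Var(b1) = h * sum(x^3) / (sum(x^2))^2   (OLS, b1 = sum(xy)/sum(x^2))
# Var(b2) = h / sum(x)                    (ratio estimator, b2 = sum(y)/sum(x))
var_b1 = h * np.sum(x**3) / np.sum(x**2) ** 2
var_b2 = h / np.sum(x)

# Cauchy-Schwarz: (sum x^2)^2 <= sum(x^3) * sum(x),
# hence var_b2 <= var_b1 -- b2 is BLUE under this error structure
```

This mirrors the GLS logic: weighting the normal equation by $1/x_i$ turns $\sum x_i(y_i - \beta x_i)/x_i = 0$ into $\sum y_i = \beta \sum x_i$, which is exactly $b_2$.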
II. Derivations (You must show all your work for full credit.)
i. Given the model $y = X\beta + \varepsilon$, derive the least squares estimate for $\beta$. (10 points)
ii. Show that $\hat\beta = (X^{\top}X)^{-1}X^{\top}y$ is an unbiased estimate for $\beta$. (10 points)
iii. Given $V(\hat\beta) = E[(\hat\beta - \beta)(\hat\beta - \beta)^{\top}]$, derive the variance-covariance matrix for the least squares estimator. (10 points)
iv. Given the model $y = X\beta + \varepsilon$, the transformation matrix $T$, and $T^{\top}T = \Omega^{-1}$, derive the GLS estimator. (10 points)
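The derivations above can be checked numerically. The sketch below (dimensions and data hypothetical) computes the OLS estimator in closed form, then obtains GLS two equivalent ways: by transforming with $T$ satisfying $T^{\top}T = \Omega^{-1}$ and running OLS on $(Ty, TX)$, and directly via $(X^{\top}\Omega^{-1}X)^{-1}X^{\top}\Omega^{-1}y$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, 2.0, -0.5])

# Heteroskedastic errors with known diagonal covariance Omega
omega_diag = rng.uniform(0.5, 2.0, size=n)
Omega = np.diag(omega_diag)
y = X @ beta + rng.normal(0, np.sqrt(omega_diag))

# OLS: beta_hat = (X'X)^{-1} X'y
b_ols = np.linalg.solve(X.T @ X, X.T @ y)

# GLS via the transformation T with T'T = Omega^{-1}:
# here T = diag(1/sqrt(omega_i)); regress Ty on TX by ordinary OLS
T = np.diag(1.0 / np.sqrt(omega_diag))
Xt, yt = T @ X, T @ y
b_gls = np.linalg.solve(Xt.T @ Xt, Xt.T @ yt)

# Equivalent closed form: (X' Omega^{-1} X)^{-1} X' Omega^{-1} y
Oi = np.linalg.inv(Omega)
b_gls2 = np.linalg.solve(X.T @ Oi @ X, X.T @ Oi @ y)
```

The two GLS routes agree to machine precision, which is exactly the point of part iv: premultiplying by $T$ reduces GLS to OLS on transformed data.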
Question 5. Given sample data $(x_i, y_i)$ and sample size $n$, we fit the simple regression model $y_i = \beta_0 + \beta_1 x_i + e_i$ and obtain the least squares estimators. (a) Suppose $\hat\beta_0 = 1$, $\hat\beta_1 = 2$, and $x = 1$. Compute $\hat y$. (b) Suppose the sample correlation $r_{xy} = 0.5$; compute $R^2$.
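A worked check of both parts (the value $r_{xy} = 0.5$ is as reconstructed above). Part (a) is just the fitted line evaluated at $x = 1$; part (b) uses the fact that in simple regression $R^2$ equals the squared sample correlation:

```python
# Part (a): fitted value at x = 1 with beta0_hat = 1, beta1_hat = 2
b0, b1, x = 1.0, 2.0, 1.0
y_hat = b0 + b1 * x          # 1 + 2*1 = 3

# Part (b): in simple regression, R^2 = r_xy^2
r_xy = 0.5
r_squared = r_xy ** 2        # 0.25
```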
Problem 2. (Regression without intercept, 50 pts) Suppose you are given the model: $Y_i = \beta X_i + u_i$, $E[u_i \mid X_i] = 0$. A) Derive the OLS estimator $\hat\beta$. B) After you estimate $\beta$, you can obtain the residuals $\hat u_i = Y_i - \hat\beta X_i$. Does $\sum_{i=1}^{n} \hat u_i = 0$? Explain why, and show your derivation.
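A small simulation (data hypothetical) illustrating the answer to part B: without an intercept, the only normal equation is $\sum X_i \hat u_i = 0$, so the residuals are orthogonal to $X$ but their plain sum is generally nonzero, because there is no column of ones forcing it to zero:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(1, 3, size=30)
y = 0.8 * x + rng.normal(size=30)

# OLS without intercept: beta_hat = sum(x*y) / sum(x^2)
beta_hat = np.sum(x * y) / np.sum(x * x)
resid = y - beta_hat * x

orth = np.sum(x * resid)    # ~0: the normal equation sum(x_i * u_hat_i) = 0
total = np.sum(resid)       # generally nonzero: no intercept in the model
```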
Suppose we have a regression model $Y_i = b X_i + e_i$ where $\bar Y = \bar X = 0$ and there is no intercept in the model. Consider the slope estimator $\hat b = \sum X_i \sum Y_i / \sum X_i^2$. Show whether this will yield an unbiased estimate of $b$ or not.
Suppose we have the full rank linear model $y = X\beta + \varepsilon$ with $n \times p$ design matrix $X$ and normal errors $\varepsilon \sim N(0, \sigma^2 I_{n \times n})$. Let $b$ be the least squares estimator of $\beta$. (c) Prove that $(b - \beta)^{\top} X^{\top} X (b - \beta)/\sigma^2$ follows the $\chi^2_p$ distribution. Hint: write $Xb$ in terms of $X$, $\beta$, and $\varepsilon$. (d) Hence derive the $100(1-\alpha)\%$ joint confidence region for $\beta$ given in the notes, $(b - \beta)^{\top} X^{\top} X (b - \beta)/(p s^2) \le F_{\alpha;\, p,\, n-p}$, where $F_{\alpha;\, p,\, n-p}$ denotes the upper $\alpha$th quantile of the $F_{p, n-p}$...
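A Monte Carlo sanity check of part (c), with hypothetical $X$, $\beta$, and $\sigma$. If the quadratic form really is $\chi^2_p$, its simulated draws should have mean close to $p$ and variance close to $2p$:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, sigma = 40, 3, 1.5
X = rng.normal(size=(n, p))
beta = np.array([1.0, -1.0, 0.5])
XtX = X.T @ X

def quad_form():
    # One draw: simulate y, fit b by least squares, form the statistic
    y = X @ beta + rng.normal(0, sigma, size=n)
    b = np.linalg.solve(XtX, X.T @ y)
    d = b - beta
    return d @ XtX @ d / sigma**2

q = np.array([quad_form() for _ in range(5000)])
# If (b-beta)' X'X (b-beta) / sigma^2 ~ chi2_p, then
# q.mean() ~ p = 3 and q.var() ~ 2p = 6
```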
(6) Suppose that we estimate the model $y = \alpha_0 + \alpha_1 z + e$ when the true model was $y = \beta_0 + \beta_1 z + \beta_2 x + u$. Under what conditions, and in what direction, will $\hat\alpha_1$ be biased?
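A simulation of the omitted-variable bias this question is after, with hypothetical parameter values. The short-regression slope converges to $\beta_1 + \beta_2 \delta$, where $\delta = \operatorname{Cov}(z, x)/\operatorname{Var}(z)$, so the bias is zero only when $\beta_2 = 0$ or $z$ and $x$ are uncorrelated, and its sign is the sign of $\beta_2 \delta$:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
b0, b1, b2 = 1.0, 2.0, 3.0

z = rng.normal(size=n)
x = 0.5 * z + rng.normal(size=n)   # x correlated with z: delta = 0.5
y = b0 + b1 * z + b2 * x + rng.normal(size=n)

# Short regression of y on z only (x omitted)
a1_hat = np.cov(z, y)[0, 1] / np.var(z)

# Omitted-variable bias formula: plim a1_hat = b1 + b2 * delta
expected = b1 + b2 * 0.5   # 2 + 1.5 = 3.5, biased upward here
```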
Problem 3: Absence of Intercept. Consider the regression model $Y_i = \beta X_i + u_i$, where $u_i$ and $X_i$ satisfy Assumptions SLR.1–SLR.5.
(i) Let $\tilde\beta$ denote an estimator of $\beta$ constructed as $\tilde\beta = \bar Y / \bar X$, where $\bar Y$ and $\bar X$ are the sample means of $Y_i$ and $X_i$, respectively. Show that $\tilde\beta$ is conditionally unbiased.
(ii) Derive the least squares estimator of $\beta$.
(iii) Show that the estimator is conditionally unbiased.
(iv) Derive the conditional variance of the estimator.
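A sketch comparing the conditional variances in parts (i) and (iv), with hypothetical data. Treating $X$ as fixed with homoskedastic errors, $\operatorname{Var}(\tilde\beta \mid X) = \sigma^2/(n\bar X^2)$ and $\operatorname{Var}(\hat\beta \mid X) = \sigma^2/\sum X_i^2$; since $\sum X_i^2 \ge n\bar X^2$, least squares is never worse:

```python
import numpy as np

rng = np.random.default_rng(5)
n, sigma = 50, 1.0
X = rng.uniform(1, 4, size=n)   # fixed, nonzero-mean regressor values

# Ratio-of-means estimator: Var(Ybar/Xbar | X) = sigma^2 / (n * Xbar^2)
var_ratio = sigma**2 / (n * X.mean() ** 2)

# Least squares without intercept: Var(beta_hat | X) = sigma^2 / sum(X_i^2)
var_ols = sigma**2 / np.sum(X**2)

# sum(X_i^2) >= n * Xbar^2 (Jensen), so var_ols <= var_ratio
```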
3. Let $y = (y_1, \ldots, y_n)^{\top}$ be a set of responses, and consider the linear model $y = \mu \mathbf{1} + \varepsilon$, where $\mathbf{1} = (1, \ldots, 1)^{\top}$ and $\varepsilon$ is a vector of zero-mean, uncorrelated errors with variance $\sigma^2$. This is a linear model in which the responses have a constant but unknown mean $\mu$. We will call this model the location model. (a) If we write the location model in the usual form of the linear model $y = X\beta +$...
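A tiny worked example of the location model (data hypothetical): writing it as $y = X\beta + \varepsilon$ with $X = \mathbf{1}$ ($n \times 1$) and $\beta = \mu$, the general least squares formula $(X^{\top}X)^{-1}X^{\top}y$ collapses to the sample mean:

```python
import numpy as np

y = np.array([2.0, 3.0, 5.0, 6.0])
ones = np.ones_like(y)          # design "matrix" X = 1 (n x 1 column of ones)

# Least squares: mu_hat = (X'X)^{-1} X'y = sum(y) / n = sample mean of y
mu_hat = (ones @ y) / (ones @ ones)
```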