Least Squares and Properties of Estimators. Let xi, yi denote two series of n numbers xi: ...
2. (a) The ordinary least squares estimator of β in the classical linear regression model Yi = α + βXi + ui, i = 1, 2, ..., n, with xi = Xi − X̄ and X̄ = n⁻¹ Σi Xi, is β̂ = Σ xi yi / Σ xi². Show that Var(β̂) = σ²/Σ xi², and that no other linear unbiased estimator of β can be constructed with a smaller variance. (All symbols have their usual meanings.)
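The Gauss–Markov claim in (a) can be spot-checked by simulation. A minimal sketch, assuming illustrative values throughout (the regressor grid, σ, and the competing weights c are all my own choices, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma, alpha, beta = 8, 2.0, 1.0, 3.0      # illustrative values
X = np.linspace(0.0, 7.0, n)                  # fixed regressors
x = X - X.mean()                              # x_i = X_i - Xbar

# A competing linear unbiased estimator sum(c_i * Y_i): unbiasedness needs
# sum(c_i) = 0 (holds here because the grid is symmetric about its mean)
# and sum(c_i * X_i) = 1 (enforced by the rescaling below).
c = np.sign(x)
c = c / (c @ X)

R = 20000
Y = alpha + beta * X + rng.normal(0.0, sigma, (R, n))
ols = (Y @ x) / (x @ x)                       # OLS slope beta-hat, per replication
alt = Y @ c                                   # competing estimator, per replication

print(ols.var(), sigma**2 / (x @ x))          # matches sigma^2 / sum(x_i^2)
print(alt.var())                              # strictly larger, as Gauss-Markov predicts
```

Both estimators average to β = 3; only their variances differ.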
5. Let {X1, X2, ..., Xn} denote a random sample of size n from a population described by a random variable X. Denote the population mean of X by E(X) = μ and its variance by Var(X) = σ². Consider the following four estimators of the population mean μ: ... (an example of an average using only part of the sample, the last 3 observations) ... (an example of a weighted average) ...
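The list of four estimators is truncated above, but two of them are described well enough to illustrate the usual moral: a partial-sample average is still unbiased yet wastes information. A sketch with made-up parameters, comparing the full sample mean with the average of only the last 3 observations:

```python
import numpy as np

rng = np.random.default_rng(1)
n, mu, sigma, R = 20, 5.0, 2.0, 40000          # illustrative values
X = rng.normal(mu, sigma, (R, n))

full_mean = X.mean(axis=1)                     # Xbar, uses all n observations
last3_mean = X[:, -3:].mean(axis=1)            # average of the last 3 only

print(full_mean.mean(), last3_mean.mean())     # both near mu = 5: unbiased
print(full_mean.var(), last3_mean.var())       # sigma^2/n vs sigma^2/3
```

Both estimators are centered on μ, but the full mean's variance σ²/n = 0.2 is far below the partial average's σ²/3 ≈ 1.33.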
Assume that Y1k, ..., Ynk are i.i.d. variables following a N(μk, σ²) distribution, k ∈ {1, 2}. Denote by Ȳk the sample mean for sample k. a. Derive the distribution of Ȳ1 − Ȳ2. b. Assume now that σ is not known and is estimated by the pooled variance Sp² = [Σi (Yi1 − Ȳ1)² + Σi (Yi2 − Ȳ2)²] / (2n − 2). It can be shown that (2n − 2)Sp²/σ² ~ χ²(2n − 2). c. Show that Sp² is an unbiased estimator of the common variance σ². d. Show that T has a t(2n − 2) distribution.
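Parts (b)–(c) can be checked numerically. A simulation sketch with illustrative parameters (n, the means, and σ are my choices), verifying that the pooled variance is unbiased for σ² and that (2n − 2)Sp²/σ² has the mean and variance of a χ²(2n − 2) variable:

```python
import numpy as np

rng = np.random.default_rng(2)
n, mu1, mu2, sigma, R = 10, 1.0, 2.5, 1.5, 30000   # illustrative values
Y1 = rng.normal(mu1, sigma, (R, n))
Y2 = rng.normal(mu2, sigma, (R, n))

# pooled variance: within-sample sums of squares over 2n - 2 degrees of freedom
ss1 = ((Y1 - Y1.mean(axis=1, keepdims=True))**2).sum(axis=1)
ss2 = ((Y2 - Y2.mean(axis=1, keepdims=True))**2).sum(axis=1)
sp2 = (ss1 + ss2) / (2 * n - 2)

q = (2 * n - 2) * sp2 / sigma**2       # should behave like chi^2(2n - 2)
print(sp2.mean(), sigma**2)            # unbiasedness: both ~2.25
print(q.mean(), q.var())               # ~18 and ~36, the chi^2(18) moments
```

The match of q's first two moments with 2n − 2 and 2(2n − 2) is consistent with the stated χ²(2n − 2) result, though of course it does not prove it.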
Let Xi denote a binary variable and consider the regression Yi = β0 + β1 Xi + ui. Let Ȳ0 denote the sample mean for observations with Xi = 0 and let Ȳ1 denote the sample mean for observations with Xi = 1. Show that β̂0 = Ȳ0 and β̂0 + β̂1 = Ȳ1, so that β̂1 = Ȳ1 − Ȳ0.
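A quick numerical check of the group-means result, on made-up data (the numbers below are arbitrary):

```python
import numpy as np

# With a binary regressor, the OLS intercept equals the X = 0 group mean and
# the slope equals the difference in group means, Ybar_1 - Ybar_0.
X = np.array([0, 0, 0, 1, 1, 1, 1], dtype=float)
Y = np.array([2.0, 3.0, 4.0, 7.0, 8.0, 6.0, 9.0])

# OLS slope and intercept from the normal equations
b1 = ((X - X.mean()) @ (Y - Y.mean())) / ((X - X.mean()) @ (X - X.mean()))
b0 = Y.mean() - b1 * X.mean()

ybar0 = Y[X == 0].mean()     # mean of the X = 0 group: 3.0
ybar1 = Y[X == 1].mean()     # mean of the X = 1 group: 7.5
print(b0, ybar0)             # intercept = Ybar_0
print(b1, ybar1 - ybar0)     # slope = Ybar_1 - Ybar_0
```

The equalities hold exactly, not just approximately, because with a binary regressor the fitted values are simply the two group means.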
Suppose X1, X2, ..., Xn is an i.i.d. N(μ, c²μ²) sample, where c² is known. Let μ̃ and μ̂ denote the method of moments and maximum likelihood estimators of μ, respectively. (a) Show that μ̃ = X̄, and derive μ̂ in terms of m2 = n⁻¹ Σi Xi², the second sample (uncentered) moment. (b) Prove that both estimators μ̃ and μ̂ are consistent estimators. (c) Show that √n(μ̃ − μ) → N(0, σ̃²) and √n(μ̂ − μ) → N(0, σ̂²). Calculate σ̃² and σ̂². Which estimator...
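The printed expression for μ̂ is garbled, so treat the following as an assumption to verify against your own derivation: setting the derivative of the N(μ, c²μ²) log-likelihood to zero and multiplying through by μ³ reduces the score equation to the quadratic c²μ² + X̄μ − m2 = 0, whose positive root is the candidate MLE. The sketch below (illustrative parameter values) checks that root against a direct grid search over the log-likelihood and illustrates consistency of both estimators:

```python
import numpy as np

rng = np.random.default_rng(3)
c2, mu_true, n = 0.25, 4.0, 50000              # illustrative values
X = rng.normal(mu_true, np.sqrt(c2) * mu_true, n)

xbar = X.mean()                                 # method-of-moments estimator
m2 = (X**2).mean()                              # second uncentered sample moment
# positive root of c^2 mu^2 + xbar*mu - m2 = 0 (candidate MLE, see lead-in)
mle = (-xbar + np.sqrt(xbar**2 + 4 * c2 * m2)) / (2 * c2)

# cross-check against a grid search over the log-likelihood (up to constants)
grid = np.linspace(3.8, 4.2, 801)
ll = [-n * np.log(m) - ((X - m)**2).sum() / (2 * c2 * m**2) for m in grid]
best = grid[int(np.argmax(ll))]
print(xbar, mle, best)                          # all close to mu_true = 4
```

With n this large both estimators sit on top of μ, and the grid maximizer coincides with the quadratic's positive root, supporting the assumed form of μ̂.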
4. Let X1, X2, ..., Xn be a random sample from a N(μ, σ²) distribution, and let S1² = n⁻¹ Σi (Xi − X̄)² and S² = (n − 1)⁻¹ Σi (Xi − X̄)² be two estimators of σ². (i) Show that the MSE of S1² is smaller than the MSE of S². (ii) Find E[√S²] and suggest an unbiased estimator of σ.
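Part (i) has a clean Monte Carlo illustration. Writing V = Σ(Xi − X̄)², the biased estimator V/n trades a small bias for a variance reduction and ends up with MSE (2n − 1)σ⁴/n², below the 2σ⁴/(n − 1) of the unbiased V/(n − 1). A sketch with illustrative n and σ²:

```python
import numpy as np

rng = np.random.default_rng(4)
n, sigma2, R = 10, 1.0, 200000                       # illustrative values
X = rng.normal(0.0, np.sqrt(sigma2), (R, n))
V = ((X - X.mean(axis=1, keepdims=True))**2).sum(axis=1)

# empirical MSEs of the divide-by-n and divide-by-(n-1) estimators
mse_biased = ((V / n - sigma2)**2).mean()            # theory: (2n-1)/n^2 = 0.19
mse_unbiased = ((V / (n - 1) - sigma2)**2).mean()    # theory: 2/(n-1) ~ 0.2222
print(mse_biased, mse_unbiased)
```

For n = 10 the simulated MSEs land near 0.19 and 0.222, matching the closed forms.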
QUESTION 2. Let X1, ..., Xn be a random sample from a N(μ, σ²) distribution, and let S² = Σi (Xi − X̄)²/(n − 1) and S̃² = ((n − 1)/n) S² be two estimators of σ². Given: E(S²) = σ² and V(S²) = 2σ⁴/(n − 1). (a) Determine: (i) E(S̃²); (ii) V(S̃²); and (iii) MSE(S̃²). (b) Which of S² and S̃² has a larger mean square error? (c) Suppose that θ̂n is an estimator of θ based on a random sample of size n. Another equivalent definition of...
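The definition of the second estimator is garbled in the source; on the common reading S̃² = ((n − 1)/n) S², i.e. the divide-by-n variance estimator, parts (a) and (b) follow directly from the given moments. A worked sketch under that assumption:

```latex
\begin{align*}
E(\tilde S^2) &= \tfrac{n-1}{n}\,E(S^2) = \tfrac{n-1}{n}\,\sigma^2
  \quad\Rightarrow\quad \operatorname{Bias}(\tilde S^2) = -\tfrac{\sigma^2}{n},\\
V(\tilde S^2) &= \left(\tfrac{n-1}{n}\right)^2 V(S^2)
  = \left(\tfrac{n-1}{n}\right)^2 \frac{2\sigma^4}{n-1}
  = \frac{2(n-1)\sigma^4}{n^2},\\
\operatorname{MSE}(\tilde S^2) &= V(\tilde S^2) + \operatorname{Bias}^2(\tilde S^2)
  = \frac{2(n-1)\sigma^4}{n^2} + \frac{\sigma^4}{n^2}
  = \frac{(2n-1)\sigma^4}{n^2}
  \;<\; \frac{2\sigma^4}{n-1} = \operatorname{MSE}(S^2),
\end{align*}
```

the final inequality holding because (2n − 1)(n − 1) = 2n² − 3n + 1 < 2n². So S² (the unbiased estimator) has the larger mean square error.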
3. Let X1, ..., Xn be i.i.d. random variables with mean μ and variance σ². Let X̄ denote the sample mean and V = Σi (Xi − X̄)². (a) Derive the expected values of X̄ and V. (b) Further suppose that X1, ..., Xn are normally distributed. Let A = ((aij)) be an n×n orthogonal matrix whose first row is (1/√n, ..., 1/√n), and let Y = AX, where Y = (Y1, ..., Yn)′ and X = (X1, ..., Xn)′ are (column) vectors. (It is not necessary to know aij...
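The orthogonal-transformation device in (b) can be sketched numerically. Building A via QR on a matrix whose first column is all ones is just one convenient construction, not prescribed by the question; the point is that Y1 = √n X̄ and the remaining coordinates carry V:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 6                                      # illustrative sample size
M = np.column_stack([np.ones(n), rng.normal(size=(n, n - 1))])
Q, _ = np.linalg.qr(M)                     # first column of Q is +-ones/sqrt(n)
Q[:, 0] *= np.sign(Q[0, 0])                # flip sign so the entries are +1/sqrt(n)
A = Q.T                                    # orthogonal, first row (1/sqrt(n), ...)

X = rng.normal(2.0, 1.0, n)
Y = A @ X
V = ((X - X.mean())**2).sum()
print(Y[0], np.sqrt(n) * X.mean())         # Y_1 = sqrt(n) * Xbar
print((Y[1:]**2).sum(), V)                 # sum_{i>=2} Y_i^2 = V
```

Because A preserves squared length, Σ Yi² = Σ Xi², and subtracting Y1² = n X̄² leaves exactly V; this is the step that makes X̄ and V independent in the normal case.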
4. (a) Let X1, X2, ..., Xn be n observations from an N(μ, σ²) distribution, and define the estimators ... (i) Determine whether T1 and T2 are unbiased estimators of μ. [4 points] (ii) Compute the variances Var(T1) and Var(T2). Which is the better estimator, T1 or T2, and why? [2 points] (iii) Determine the maximum likelihood estimator of μ. [5 points] (b) A manufacturer is testing the performance of two products, A and B. At each of 20 field sites, product A and...
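The definitions of T1 and T2 are truncated out of the text above, but part (a)(iii) can still be illustrated: for a normal sample the log-likelihood in μ is maximized where Σ(Xi − m)² is minimized, i.e. at X̄, whatever σ² is. A small numerical check on made-up data:

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(10.0, 3.0, 50)                 # made-up N(mu, sigma^2) sample
grid = np.linspace(8.0, 12.0, 4001)
sse = [((X - m)**2).sum() for m in grid]      # equals -2*sigma^2*loglik + const
mhat = grid[int(np.argmin(sse))]
print(mhat, X.mean())                          # agree up to the grid resolution
```

The grid minimizer lands on the sample mean, consistent with the analytic answer μ̂ = X̄.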
Which of the following is not one of the least squares assumptions used in Stock and Watson to show that the OLS estimators are unbiased and consistent and have approximately a normal distribution in large samples?
1) large outliers are unlikely
2) the error term is homoskedastic, i.e., Var(ui ∣ X = x) does not depend on x
3) the sample (Xi, Yi), i = 1, ..., n, constitutes an i.i.d. random sample from the population joint distribution of X and Y
4) the conditional mean of the...
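Homoskedasticity is the odd one out here: it matters for efficiency and for the textbook standard-error formulas, not for unbiasedness or consistency. A simulation sketch making that concrete (the data-generating process below is my own, not from Stock and Watson): OLS stays centered on the true slope even when Var(u ∣ X = x) grows with x.

```python
import numpy as np

rng = np.random.default_rng(7)
beta0, beta1, n, R = 1.0, 2.0, 200, 5000
slopes = np.empty(R)
for r in range(R):
    X = rng.uniform(0.0, 4.0, n)
    u = rng.normal(0.0, 0.5 + X, n)      # heteroskedastic: sd of u grows with x
    Y = beta0 + beta1 * X + u
    x = X - X.mean()
    slopes[r] = (x @ Y) / (x @ x)        # OLS slope for this replication
print(slopes.mean())                      # still centered on beta1 = 2
```

The average estimated slope sits on β1 = 2 despite the heteroskedastic errors; what breaks under heteroskedasticity is the homoskedasticity-only variance formula, not the estimator itself.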