Suppose that E(θ̂₁) = E(θ̂₂) = θ, Var(θ̂₁) = σ₁², Var(θ̂₂) = σ₂², and Cov(θ̂₁, θ̂₂) = σ₁₂. Consider the unbiased estimator θ̂₃ = aθ̂₁ + (1 − a)θ̂₂. What value should be chosen for the constant a in order to minimize the variance, and thus the mean squared error, of θ̂₃ as an estimator of θ? Note: The second derivative of the variance function is positive, which you can verify by knowing that the correlation coefficient ρ = σ₁₂/(σ₁σ₂) is between −1 and 1; however, ...
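A sketch of the minimization, using only the quantities defined above:
Var(θ̂₃) = a²σ₁² + (1 − a)²σ₂² + 2a(1 − a)σ₁₂
d Var(θ̂₃)/da = 2aσ₁² − 2(1 − a)σ₂² + 2(1 − 2a)σ₁₂ = 0  ⇒  a = (σ₂² − σ₁₂)/(σ₁² + σ₂² − 2σ₁₂)
d² Var(θ̂₃)/da² = 2(σ₁² + σ₂² − 2σ₁₂) ≥ 2(σ₁ − σ₂)² ≥ 0, since |σ₁₂| ≤ σ₁σ₂ when −1 ≤ ρ ≤ 1, so this value of a is indeed the minimizer.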
5. Suppose X and Y are random variables such that E(X) = E(Y) = θ, Var(X) = σ₁², and Var(Y) = σ₂². Consider a new random variable W = aX + (1 − a)Y. (a) Show that W is unbiased for θ. (b) If X and Y are independent, how should the constant a be chosen in order to minimize the variance of W?
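A sketch: E(W) = aθ + (1 − a)θ = θ for every a, so W is unbiased. Under independence, Var(W) = a²σ₁² + (1 − a)²σ₂², which is the σ₁₂ = 0 case of the sketch above, so the minimizing choice is a = σ₂²/(σ₁² + σ₂²); each variable is weighted in inverse proportion to its variance, and the minimized variance is σ₁²σ₂²/(σ₁² + σ₂²).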
1. Consider a variable y = θ + e, where θ is an unknown parameter and e is a random variable with mean zero. (a) What is the expected value of y? (b) Suppose you draw a sample y₁, ..., yₙ. Derive the least squares estimator for θ. For full credit you must check the second-order condition. (c) Can this estimator (θ̂) be described as a method of moments estimator? (d) Now suppose e is independent normally distributed with mean 0 and ...
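A sketch: (a) E(y) = θ + E(e) = θ. (b) The least squares criterion is S(θ) = Σᵢ(yᵢ − θ)²; setting dS/dθ = −2Σᵢ(yᵢ − θ) = 0 gives θ̂ = ȳ, and d²S/dθ² = 2n > 0, so the second-order condition for a minimum holds. (c) Yes: equating the population moment E(y) = θ to the sample moment ȳ yields the same estimator θ̂ = ȳ.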
(3) Suppose that E(θ̂₁) = θ, E(θ̂₂) = θ, V(θ̂₁) = σ₁², and V(θ̂₂) = σ₂². Assume that θ̂₁ and θ̂₂ are independent. Consider the following estimator: θ̂₃ = aθ̂₁ + (1 − a)θ̂₂. (a) Show that θ̂₃ is unbiased for θ. (b) Find the value of a that minimizes the variance of θ̂₃. (c) Which estimator would you use, θ̂₁, θ̂₂, or θ̂₃, when using the value of a found in part (b)?
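A sketch: (a) E(θ̂₃) = aθ + (1 − a)θ = θ. (b) With independence, Var(θ̂₃) = a²σ₁² + (1 − a)²σ₂², minimized at a = σ₂²/(σ₁² + σ₂²), as in the problems above. (c) At that a, Var(θ̂₃) = σ₁²σ₂²/(σ₁² + σ₂²) ≤ min(σ₁², σ₂²), so θ̂₃ is at least as good as θ̂₁ or θ̂₂ alone.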
4. Let X₁, ..., Xₙ be independent and suppose that E(Xᵢ) = kᵢθ + bᵢ for known constants kᵢ and bᵢ, and Var(Xᵢ) = σ², i = 1, ..., n. (a) Find the least squares estimator θ̂ of θ. (b) Show that θ̂ is unbiased. (c) Show that the variance of θ̂ is Var(θ̂) = σ²/Σᵢ kᵢ². (d) Show that Σᵢ(Xᵢ − kᵢθ − bᵢ)² = Σᵢ(Xᵢ − kᵢθ̂ − bᵢ)² + Σᵢ kᵢ²(θ̂ − θ)². (e) Hence show that ...
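A sketch: minimizing Σᵢ(Xᵢ − kᵢθ − bᵢ)² gives the first-order condition Σᵢ kᵢ(Xᵢ − kᵢθ − bᵢ) = 0, so θ̂ = Σᵢ kᵢ(Xᵢ − bᵢ)/Σᵢ kᵢ². Then E(θ̂) = Σᵢ kᵢ(kᵢθ)/Σᵢ kᵢ² = θ, and by independence Var(θ̂) = σ²Σᵢ kᵢ²/(Σᵢ kᵢ²)² = σ²/Σᵢ kᵢ². For (d), write Xᵢ − kᵢθ − bᵢ = (Xᵢ − kᵢθ̂ − bᵢ) + kᵢ(θ̂ − θ) and expand the square; the cross term vanishes because Σᵢ kᵢ(Xᵢ − kᵢθ̂ − bᵢ) = 0 by the first-order condition.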
Linear statistical modeling & regression, please. I need the solution for Q3, but I am copying Q2 because you need information from Q2 in order to answer Q3. 2) Suppose you have the multiple regression setup Y = Xβ + ε (X is n × p, β is p × 1). The ridge regression estimator is given by ... Here, ‖v‖² = Σᵢ vᵢ², where v is a vector with entries vᵢ. a) Find the expectation and variance-covariance matrix of β̂_ridge when X'X is a diagonal matrix with each diagonal entry equal to ... Compare these variances with the ...
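A sketch of part (a), assuming the standard ridge form β̂_ridge = (X'X + λI)⁻¹X'Y for the model Y = Xβ + ε with E(ε) = 0 and Var(ε) = σ²I (the symbols λ and σ² and this exact form are assumptions, not taken from the problem text): E(β̂_ridge) = (X'X + λI)⁻¹X'Xβ and Var(β̂_ridge) = σ²(X'X + λI)⁻¹X'X(X'X + λI)⁻¹. If X'X = cI with c > 0, these reduce to E(β̂_ridge) = (c/(c + λ))β and Var(β̂_ridge) = (σ²c/(c + λ)²)I, whereas the OLS variance is (σ²/c)I; since c/(c + λ)² < 1/c for λ > 0, each ridge coefficient has smaller variance, at the cost of bias.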
Consider the following assumptions:
1. Yᵢ = ?(? + εᵢ) (data generating process)
2. E(εᵢ) = 0 for all i
3. Var(εᵢ) = σ² for all i
4. Cov(εᵢ, εⱼ) = 0 for i ≠ j
5. εᵢ ∼ Normal
And suppose you're interested in generating an estimate for θ. a. What is the expected value of the sample mean estimator, θ̂ = (1/n)Σᵢ Yᵢ, under these assumptions? Is θ̂ an unbiased estimator for θ? Show all work ...
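A sketch of part (a), assuming the data generating process implies E(Yᵢ) = θ (for example, Yᵢ = θ + εᵢ combined with assumption 2; this reading is an assumption, since assumption 1 is not fully specified above): E(θ̂) = (1/n)Σᵢ E(Yᵢ) = θ, so θ̂ would be unbiased. If the actual data generating process gives E(Yᵢ) ≠ θ, the same first step, E(θ̂) = (1/n)Σᵢ E(Yᵢ), shows exactly where the bias enters.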