(3) Suppose that E(θ̂1) = θ, E(θ̂2) = θ, V(θ̂1) = σ1², and V(θ̂2) = σ2². Assume that θ̂1 and θ̂2 are independent. Consider the following estimator: θ̂3 = a·θ̂1 + (1 − a)·θ̂2.
(a) Show that θ̂3 is unbiased for θ.
(b) Find the value of a that minimizes the variance of θ̂3.
(c) Which estimator would you use, θ̂1, θ̂2, or θ̂3, when using the value of a found in part (b)?
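For part (b): under independence, V(θ̂3) = a²σ1² + (1 − a)²σ2², and setting dV/da = 2aσ1² − 2(1 − a)σ2² = 0 gives a* = σ2²/(σ1² + σ2²). A quick numerical sanity check of that claim (σ1² = 4 and σ2² = 1 are arbitrary illustration values, not part of the problem):

```python
import numpy as np

# Made-up variances for the two independent estimators (illustration only).
s1_sq, s2_sq = 4.0, 1.0

def var_theta3(a):
    # V(a*th1 + (1-a)*th2) = a^2*s1^2 + (1-a)^2*s2^2 under independence
    return a**2 * s1_sq + (1 - a)**2 * s2_sq

a_star = s2_sq / (s1_sq + s2_sq)      # claimed minimizer
grid = np.linspace(-1.0, 2.0, 30001)  # brute-force search over a
a_best = grid[np.argmin(var_theta3(grid))]

print(a_star, a_best)  # both approximately 0.2
```

Note that the minimized variance, σ1²σ2²/(σ1² + σ2²), is smaller than both σ1² and σ2², which is what part (c) is getting at.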
Suppose that E[θ̂1] = E[θ̂2] = θ, Var[θ̂1] = σ1², Var[θ̂2] = σ2², and Cov[θ̂1, θ̂2] = σ12. Consider the unbiased estimator θ̂3 = a·θ̂1 + (1 − a)·θ̂2. What value should be chosen for the constant a in order to minimize the variance, and thus the mean squared error, of θ̂3 as an estimator of θ? Note: the second derivative of the variance function is positive, which you can verify by knowing that the correlation coefficient ρ = σ12/(σ1σ2) is between −1 and 1; however, ...
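With the covariance term included, V(θ̂3) = a²σ1² + (1 − a)²σ2² + 2a(1 − a)σ12, and the same calculus as the independent case gives a* = (σ2² − σ12)/(σ1² + σ2² − 2σ12); the second derivative 2(σ1² + σ2² − 2σ12) is positive because |σ12| < σ1σ2. A grid check with arbitrary example values (not from the problem):

```python
import numpy as np

# Made-up example values with |rho| < 1, so the denominator below is positive.
s1_sq, s2_sq, s12 = 4.0, 1.0, 0.5

def var_theta3(a):
    # V(a*th1 + (1-a)*th2) when Cov(th1, th2) = s12
    return a**2 * s1_sq + (1 - a)**2 * s2_sq + 2 * a * (1 - a) * s12

a_star = (s2_sq - s12) / (s1_sq + s2_sq - 2 * s12)  # claimed minimizer
grid = np.linspace(-2.0, 3.0, 50001)                # brute-force search
a_best = grid[np.argmin(var_theta3(grid))]

print(a_star, a_best)  # both approximately 0.125 for these example values
```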
Suppose that θ̂1 and θ̂2 are both unbiased estimators of θ.
(a) Show that the estimator θ̂3 = t·θ̂1 + (1 − t)·θ̂2 is also an unbiased estimator of θ for any value of the constant t.
(b) Suppose V[θ̂1] = σ1² and V[θ̂2] = σ2². If θ̂1 and θ̂2 are independent, find an expression for V[θ̂3] in terms of t, σ1², and σ2².
(c) Find the value of t that produces an estimator of the form θ̂3 = t·θ̂1 + (1 − t)·θ̂2 that has the smallest possible variance. (Your final answer will be in ...
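Part (a) needs only linearity of expectation: E[θ̂3] = t·E[θ̂1] + (1 − t)·E[θ̂2] = tθ + (1 − t)θ = θ for every t. A small Monte Carlo illustration of this (θ = 5, the normal distributions, and t = −0.7 are all made-up choices; any real t works):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, t = 5.0, -0.7     # arbitrary true value and weight (t need not be in [0, 1])
reps = 200_000

# th1, th2: independent unbiased estimators of theta (illustrative distributions)
th1 = rng.normal(theta, 2.0, reps)
th2 = rng.normal(theta, 1.0, reps)
th3 = t * th1 + (1 - t) * th2

print(th3.mean())  # close to theta = 5 even though t is negative
```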
4. Let X1, ..., Xn be independent and suppose that E(Xi) = kiθ + bi, for known constants ki and bi, and Var(Xi) = σ², i = 1, ..., n.
(a) Find the least squares estimator θ̂ of θ.
(b) Show that θ̂ is unbiased.
(c) Show that the variance of θ̂ is Var(θ̂) = σ² / Σᵢ₌₁ⁿ ki².
(d) Show that Σᵢ₌₁ⁿ (Xi − kiθ − bi)² = Σᵢ₌₁ⁿ (Xi − kiθ̂ − bi)² + Σᵢ₌₁ⁿ ki²(θ̂ − θ)².
(e) Hence show that ...
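As a sketch of where parts (a)–(c) head: minimizing Σ(Xi − kiθ − bi)² over θ gives θ̂ = Σ ki(Xi − bi) / Σ ki², which is unbiased under the stated model with Var(θ̂) = σ²/Σ ki². A simulation check with made-up constants (the ki, bi, θ, and σ below are illustration values only):

```python
import numpy as np

rng = np.random.default_rng(1)
k = np.array([1.0, 2.0, -1.5, 0.5])   # made-up known constants k_i
b = np.array([0.0, 3.0, 1.0, -2.0])   # made-up known constants b_i
theta, sigma = 2.0, 1.5               # arbitrary true parameter and noise scale
reps = 100_000

# Each row is one sample X_1, ..., X_n with E(X_i) = k_i*theta + b_i
X = theta * k + b + rng.normal(0.0, sigma, size=(reps, k.size))

# Least squares estimator, theta_hat = sum(k_i*(X_i - b_i)) / sum(k_i^2),
# computed for every replicate at once
theta_hat = (X - b) @ k / (k @ k)

print(theta_hat.mean())  # approx theta = 2 (unbiasedness, part (b))
print(theta_hat.var())   # approx sigma^2 / sum(k_i^2) = 2.25/7.5 = 0.3 (part (c))
```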
I REALLY need numbers 2, 3, and 5 by tomorrow morning. I have no clue how to do these. I know the image quality is iffy, but please help as best you can.

STA4322 Homework 1, Spring 2019. Please turn in your own work, though you may discuss the problems with classmates, the TA, the Professor, the internet, etc. The most important thing is that you understand the problems and how they are solved, as they will prepare you for the exam.
3. [20 marks] Consider the multinomial distribution with 3 categories, where the random variables X1, X2, and X3 have the joint probability function, where x = (x1, x2, x3), θ = (θ1, θ2), n = x1 + x2 + x3, θ1, θ2 > 0, and 1 − θ1 − θ2 > 0.
(a) [4 marks] Find the maximum likelihood estimator θ̂ of θ.
(b) [4 marks] Find the Fisher information matrix I(θ).
(c) [4 marks] Show that θ̂ is an MVUE.
(d) ...
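The probability function itself did not survive the image. Assuming the standard trinomial form f(x; θ) = n!/(x1! x2! x3!) · θ1^x1 · θ2^x2 · (1 − θ1 − θ2)^x3, the MLE in part (a) is θ̂ = (X1/n, X2/n). A brute-force check that this maximizes the log-likelihood for one made-up data point (the counts below are invented for illustration):

```python
import numpy as np

x = np.array([3, 5, 2])   # made-up category counts; n = 10
n = x.sum()

def loglik(t1, t2):
    # Log-likelihood up to the constant log(n!/(x1!x2!x3!)),
    # assuming the standard trinomial probability function.
    return x[0] * np.log(t1) + x[1] * np.log(t2) + x[2] * np.log(1 - t1 - t2)

mle = x[:2] / n           # candidate MLE: (0.3, 0.5)

# Grid search over the interior of the simplex {t1, t2 > 0, t1 + t2 < 1}
best, best_t = -np.inf, None
for t1 in np.arange(0.01, 0.99, 0.01):
    for t2 in np.arange(0.01, 0.99 - t1, 0.01):
        ll = loglik(t1, t2)
        if ll > best:
            best, best_t = ll, (t1, t2)

print(mle, best_t)  # grid maximizer lands at approximately (0.3, 0.5)
```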
5. Suppose X and Y are random variables such that E(X) = E(Y) = θ, Var(X) = σX², and Var(Y) = σY². Consider a new random variable W = aX + (1 − a)Y.
(a) Show that W is unbiased for θ.
(b) If X and Y are independent, how should the constant a be chosen in order to minimize the variance of W?
QUESTION 2
Let X1, ..., Xn be a random sample from a N(μ, σ²) distribution, and let S² and S̃² = ((n − 1)/n)·S² be two estimators of σ². Given: E(S²) = σ², where S² = Σᵢ₌₁ⁿ (Xi − X̄)²/(n − 1), and V(S²) = 2σ⁴/(n − 1).
(a) Determine: (i) E(S̃²); (ii) V(S̃²); and (iii) MSE(S̃²).
(b) Which of S² and S̃² has the larger mean squared error?
(c) Suppose that θ̂n is an estimator of θ based on a random sample of size n. Another equivalent definition of ...
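For part (a), E(S̃²) = ((n − 1)/n)σ², so the bias is −σ²/n, and V(S̃²) = ((n − 1)/n)² · 2σ⁴/(n − 1) = 2(n − 1)σ⁴/n²; adding squared bias gives MSE(S̃²) = (2n − 1)σ⁴/n², versus MSE(S²) = 2σ⁴/(n − 1). A simulation check of these closed forms (n = 10 and σ = 1 are arbitrary illustration values):

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma, reps = 10, 1.0, 200_000   # arbitrary sample size, scale, replicates

X = rng.normal(0.0, sigma, size=(reps, n))
S2 = X.var(axis=1, ddof=1)          # usual unbiased sample variance S^2
S2_tilde = (n - 1) / n * S2         # the shrunken estimator (n-1)/n * S^2

mse_S2 = np.mean((S2 - sigma**2) ** 2)
mse_tilde = np.mean((S2_tilde - sigma**2) ** 2)

print(mse_S2, 2 * sigma**4 / (n - 1))            # simulated vs 2/(n-1)
print(mse_tilde, (2 * n - 1) * sigma**4 / n**2)  # simulated vs (2n-1)/n^2
print(mse_tilde < mse_S2)                        # True: S~^2 has smaller MSE
```

Since (2n − 1)(n − 1) = 2n² − 3n + 1 < 2n², S̃² has the smaller mean squared error for every n, which answers part (b).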
I figured out 1, 2, and 3, but I'm stuck on 4 and 5. Please help me out if you can! I know the quality isn't the greatest, I'm sorry!