5. Suppose θ̂1 and θ̂2 are estimators of θ with E[θ̂1] = E[θ̂2] = θ, Var(θ̂1) = σ₁², Var(θ̂2) = σ₂², and Cov(θ̂1, θ̂2) = σ₁₂. Consider the unbiased estimator θ̂3 = aθ̂1 + (1 − a)θ̂2. What value should be chosen for the constant a in order to minimize the variance, and thus the mean squared error, of θ̂3 as an estimator of θ? Note: The second derivative of the variance function is positive, which you can verify using the fact that the correlation coefficient ρ = σ₁₂/(σ₁σ₂) lies between −1 and 1; however, ...
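A worked sketch of the minimization, using the quantities defined in the problem:

```latex
\operatorname{Var}(\hat\theta_3) = a^2\sigma_1^2 + (1-a)^2\sigma_2^2 + 2a(1-a)\sigma_{12}.
% Set the first derivative in a to zero:
\frac{d}{da}\operatorname{Var}(\hat\theta_3)
  = 2a\sigma_1^2 - 2(1-a)\sigma_2^2 + 2(1-2a)\sigma_{12} = 0
\;\Longrightarrow\;
a^{*} = \frac{\sigma_2^2 - \sigma_{12}}{\sigma_1^2 + \sigma_2^2 - 2\sigma_{12}}.
% Second-order condition (ties to the note above):
\frac{d^2}{da^2}\operatorname{Var}(\hat\theta_3)
  = 2(\sigma_1^2 + \sigma_2^2 - 2\sigma_{12})
  = 2\operatorname{Var}(\hat\theta_1 - \hat\theta_2) \ge 0,
```

with strict positivity whenever the two estimators are not perfectly correlated, so a* is indeed a minimizer.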
1. Let X and Y be random variables, with μX = E(X), μY = E(Y), σX² = Var(X), and σY² = Var(Y). If a, b, c, and d are fixed real numbers, (a) show that Cov(aX + b, cY + d) = ac Cov(X, Y); (b) show that Corr(aX + b, cY + d) = ρXY for a > 0 and c > 0.
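Part (a) follows in one line from the definition of covariance; a sketch:

```latex
\operatorname{Cov}(aX+b,\,cY+d)
 = E\big[(aX+b - (a\mu_X+b))\,(cY+d - (c\mu_Y+d))\big]
 = ac\,E\big[(X-\mu_X)(Y-\mu_Y)\big]
 = ac\operatorname{Cov}(X,Y).
```

For (b), divide by the standard deviations |a|σX and |c|σY; when a > 0 and c > 0 the factor ac/(|a||c|) equals 1, leaving ρXY.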
5. Suppose X is a normally distributed random variable with mean μ and variance 2. Consider a new random variable, W = 2X + 3. i. What is E(W)? ii. What is Var(W)? 6. Suppose the random variables X and Y are jointly distributed. Define a new random variable, W = 2X + 3Y. i. What is Var(W)? ii. What is Var(W) if X and Y are independent?
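A quick numerical sanity check of problem 5, using the linearity rules E(2X + 3) = 2μ + 3 and Var(2X + 3) = 4 Var(X). The specific mean μ = 1 is an illustrative assumption, not given in the problem:

```python
import math
import random
import statistics

# X ~ Normal(mu, variance 2); W = 2X + 3.
# Expected results: E(W) = 2*mu + 3, Var(W) = 4 * 2 = 8.
mu = 1.0  # assumed value for illustration only
random.seed(42)
xs = [random.gauss(mu, math.sqrt(2)) for _ in range(200_000)]
ws = [2 * x + 3 for x in xs]

print(statistics.fmean(ws))     # close to 2*mu + 3 = 5
print(statistics.variance(ws))  # close to 8
```

For problem 6, the analogous rule is Var(2X + 3Y) = 4 Var(X) + 9 Var(Y) + 12 Cov(X, Y), with the cross term dropping out under independence.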
9. Let X and Y be two random variables. Suppose that σX² = 4 and σY² = 9. If we know that the two random variables Z = 2X − Y and W = X + Y are independent, find Cov(X, Y) and ρ(X, Y).
10. Let X and Y be bivariate normal random variables with parameters μX = 0, σX = 1, μY = 1, σY = 2, and ρ = −0.5. Find P(X + 2Y < 3). Find Cov(X − Y, X + 2Y).
11. Let X and Y ...
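Reading problem 9 as Z = 2X − Y with σX² = 4 and σY² = 9 (the symbols are garbled in the source, so this is one plausible reconstruction), independence forces the covariance of Z and W to vanish:

```latex
\operatorname{Cov}(Z, W) = \operatorname{Cov}(2X - Y,\; X + Y)
 = 2\sigma_X^2 + 2\operatorname{Cov}(X,Y) - \operatorname{Cov}(X,Y) - \sigma_Y^2
 = 2\sigma_X^2 + \operatorname{Cov}(X,Y) - \sigma_Y^2 = 0,
```

so under this reading Cov(X, Y) = σY² − 2σX² = 9 − 8 = 1 and ρ(X, Y) = 1/(σXσY) = 1/6.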
Suppose that X and Y are independent random variables with the same unknown mean μ. Both X and Y have a variance of 36. Let T = aX + bY be an estimator of μ. What condition must a and b satisfy in order that T be an unbiased estimator for μ? Is T a normal random variable?
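The unbiasedness condition drops straight out of linearity of expectation:

```latex
E(T) = aE(X) + bE(Y) = (a + b)\mu,
\qquad
E(T) = \mu \ \text{for all } \mu \iff a + b = 1,
```

and by independence Var(T) = 36(a² + b²). Whether T is normal depends on the distributions of X and Y, which the problem as transcribed does not specify; T is normal whenever X and Y are.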
Let X1, X2, ..., Xn be independent random variables with mean μ and variance σ². Let Y1, Y2, ..., Ym be independent random variables with mean μ and variance σ². Let W = aX̄ + (1 − a)Ȳ, where 0 < a < 1 and X̄, Ȳ denote the sample means. (a) Compute the expected value of W. (b) For what value of a is the variance of W a minimum?
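Assuming the reconstruction above (W a convex combination of the two sample means, both with variance σ²), a sketch of both parts:

```latex
E(W) = a\mu + (1-a)\mu = \mu,
\qquad
\operatorname{Var}(W) = a^2\frac{\sigma^2}{n} + (1-a)^2\frac{\sigma^2}{m}.
% Minimize over a:
\frac{d}{da}\operatorname{Var}(W) = \frac{2a\sigma^2}{n} - \frac{2(1-a)\sigma^2}{m} = 0
\;\Longrightarrow\;
a = \frac{n}{n+m},
```

i.e. W is unbiased for any a, and the variance is minimized by weighting each sample mean in proportion to its sample size.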
Suppose X ~ Pois(λ) and Y ~ Pois(2λ) are independent random variables. Consider a linear estimator of λ, that is, λ̂ = aX + bY. (a) Find an expression for the bias of λ̂ in terms of a and b, and determine a condition on the values of a and b such that λ̂ is unbiased. (b) Of all the values of a and b that make the estimator unbiased, find the values of a and b that minimize the variance of the ...
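A worked sketch, using E(X) = Var(X) = λ and E(Y) = Var(Y) = 2λ for the Poisson distributions:

```latex
E(\hat\lambda) = a\lambda + 2b\lambda
\;\Longrightarrow\;
\operatorname{bias}(\hat\lambda) = (a + 2b - 1)\lambda,
\quad \text{unbiased} \iff a + 2b = 1.
% By independence:
\operatorname{Var}(\hat\lambda) = a^2\lambda + 2b^2\lambda.
% Substitute a = 1 - 2b and minimize (1-2b)^2 + 2b^2 over b:
-4(1-2b) + 4b = 0 \;\Longrightarrow\; b = \tfrac{1}{3},\; a = \tfrac{1}{3},
\quad \operatorname{Var}(\hat\lambda) = \tfrac{\lambda}{3}.
```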
1. Consider a variable y = θ + e, where θ is an unknown parameter and e is a random variable with mean zero. (a) What is the expected value of y? (b) Suppose you draw a sample of n observations on y. Derive the least squares estimator for θ. For full credit you must check the second-order condition. (c) Can this estimator be described as a method of moments estimator? (d) Now suppose e is independent normally distributed with mean 0 and ...
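A sketch of parts (b) and (c): minimize the sum of squared deviations and check the second-order condition,

```latex
S(\theta) = \sum_{i=1}^{n}(y_i - \theta)^2,
\qquad
S'(\theta) = -2\sum_{i=1}^{n}(y_i - \theta) = 0
\;\Longrightarrow\;
\hat\theta = \bar y,
\qquad
S''(\theta) = 2n > 0,
```

so the critical point is a minimum. Since E(y) = θ, matching the first population moment to the first sample moment also yields ȳ, so the least squares estimator coincides with a method of moments estimator here.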
X and Y are random variables. (a) Show that E(X) = E(E(X|Y)). (b) If P({X ≤ x, Y ≤ y}) = P({X ≤ x})P({Y ≤ y}) for all x and y, then show that E(XY) = E(X)E(Y), i.e., if two random variables are independent, then show that they are uncorrelated. Is the reverse true? Prove or disprove. (c) The moment generating function of a random variable Z is defined as Ψ_Z(t) := E(e^{tZ}). Now if X and Y are independent random variables, then show that Ψ_{X+Y}(t) = Ψ_X(t)Ψ_Y(t). Also, if Ψ_X(t) = λ/(λ − t), ... (d) Show the conditional ...
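For the "is the reverse true?" part of (b), a standard counterexample is a symmetric X with Y = X²: the pair is uncorrelated but clearly dependent. The discrete distribution below is chosen for illustration, and the computation is exact:

```python
from fractions import Fraction

# Counterexample: uncorrelated does NOT imply independent.
# Let X be uniform on {-1, 0, 1} and Y = X**2, a deterministic
# function of X (hence dependent).
support = [-1, 0, 1]
p = Fraction(1, 3)  # P(X = x) for each x in the support

E_X = sum(p * x for x in support)
E_Y = sum(p * x**2 for x in support)
E_XY = sum(p * x * x**2 for x in support)  # E[X * Y] = E[X^3]

cov = E_XY - E_X * E_Y
print(cov)  # 0 -> X and Y are uncorrelated

# Yet X and Y are dependent:
# P(X = 1, Y = 0) = 0, while P(X = 1) * P(Y = 0) = 1/9.
joint = Fraction(0)
print(joint == p * p)  # False -> not independent
```

Because Y = X² forces Y = 0 exactly when X = 0, the joint probability P(X = 1, Y = 0) cannot factor, which disproves the converse.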