Suppose that X and Y are independent random variables with the same unknown mean u. Both X and Y have variance 36. Let T = aX + bY be an estimator of u. What condition must a and b satisfy for T to be an unbiased estimator of u? Is T a normal random variable?
Given T = aX + bY.

For T to be an unbiased estimator of u, its expectation must equal the population mean: E(T) = u.

By linearity of expectation,

E(T) = a E(X) + b E(Y).

Since E(X) = u and E(Y) = u, this gives E(T) = au + bu = (a + b)u. This equals u for every value of u exactly when a + b = 1, so the required condition is a + b = 1.

Because X and Y are independent, the constants come out of the variance squared:

Var(T) = a^2 Var(X) + b^2 Var(Y) = 36(a^2 + b^2).

Note that this is not 36(a + b) = 36; the variance depends on the particular choice of a and b, and among unbiased choices it is smallest at a = b = 1/2, where Var(T) = 18. As for normality: T is guaranteed to be a normal random variable only if X and Y are themselves normal, since a linear combination of independent normal variables is normal. The problem as stated specifies only the mean and variance of X and Y, so without a normality assumption on X and Y we cannot conclude that T is normal.
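A quick Monte Carlo check makes both facts concrete. The values below are illustrative assumptions, not given in the problem: u = 5, a = 0.3, b = 0.7 (so a + b = 1), and X, Y taken as normal with variance 36.

```python
import random

# Illustrative assumptions (not from the problem statement):
# u = 5, a = 0.3, b = 0.7, and X, Y ~ Normal(u, 36).
u, a, b = 5.0, 0.3, 0.7
sigma = 6.0  # standard deviation, sqrt(36)
random.seed(42)

n = 200_000
# Draw T = aX + bY many times.
samples = [a * random.gauss(u, sigma) + b * random.gauss(u, sigma)
           for _ in range(n)]

mean_T = sum(samples) / n
var_T = sum((t - mean_T) ** 2 for t in samples) / (n - 1)

print(mean_T)  # close to u = 5, since a + b = 1 makes T unbiased
print(var_T)   # close to 36*(a**2 + b**2) = 36*0.58 = 20.88, not 36
```

The simulated variance lands near 20.88 rather than 36, confirming that the coefficients enter the variance squared.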