Suppose X ∼ Pois(λ) and Y ∼ Pois(2λ) are independent random variables. Consider a linear estimator of λ, that...
Suppose that X and Y are independent random variables with the same unknown mean μ. Both X and Y have a variance of 36. Let T = aX + bY be an estimator of μ. What condition must a and b satisfy in order for T to be an unbiased estimator of μ? Is T a normal random variable?
5. Suppose X and Y are random variables such that E(X) = E(Y) = θ, Var(X) = σ₁² and Var(Y) = σ₂². Consider a new random variable W = aX + (1-a)Y. (a) Show that W is unbiased for θ. (b) If X and Y are independent, how should the constant a be chosen in order to minimize the variance of W?
(a) Let X and Y be independent random variables, both with the same mean μ ≠ 0. Define a new random variable W = aX + bY, where a and b are constants. (i) Obtain an expression for E(W). (ii) What constraint is there on the values of a and b so that W is an unbiased estimator of μ? Hence write all unbiased versions of W as a formula involving a, X and Y only (and not b). [2]
Let Y₁, Y₂, Y₃ and Y₄ be independent, identically distributed random variables from a population with mean μ and variance σ². Let Ȳ = ¼(Y₁ + Y₂ + Y₃ + Y₄) denote the average of these four random variables. i. What are the expected value and variance of Ȳ in terms of μ and σ²? ii. Now consider a different estimator of μ: W = Y₁/8 + Y₂/8 + Y₃/4 + Y₄/2. This is an example of a weighted average of the Yᵢ. Show...
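A simulation sketch for a problem of this type, using illustrative population values (μ = 5, σ = 3, assumed here) and the weights 1/8, 1/8, 1/4, 1/2 as one example of a weighted average: any fixed weights summing to 1 give an unbiased W, but unequal weights inflate the variance relative to the plain average.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n = 5.0, 3.0, 200_000       # illustrative population values (assumed)
Y = rng.normal(mu, sigma, (n, 4))      # n independent draws of (Y1, Y2, Y3, Y4)

ybar = Y.mean(axis=1)                  # the plain average of the four variables
w = np.array([1/8, 1/8, 1/4, 1/2])     # example weights; they sum to 1, so W is unbiased
W = Y @ w

# E(W) = mu because the weights sum to 1.
# Var(W) = sigma^2 * sum(w_i^2) = (11/32) sigma^2, larger than Var(Ybar) = sigma^2 / 4.
```

The comparison illustrates why the equally weighted average is the minimum-variance choice among unbiased weighted averages.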
QUESTION 3 Suppose that Y₁, Y₂, …, Yₙ are independent random variables such that Yᵢ = βxᵢ² + εᵢ, i = 1, 2, …, n, where β is an unknown parameter, x₁, x₂, …, xₙ are known real numbers (≠ 0), and ε₁, ε₂, …, εₙ are independent random errors, each with a normal distribution with mean 0 and variance σ². (a) Show that … is an unbiased estimator of β. What is the variance of this estimator? (b) Show that the least squares estimator of β is not the same...
3. Suppose that X and Y are independent exponentially distributed random variables with parameter λ, and further suppose that U is a uniformly distributed random variable between 0 and 1 that is independent from X and Y. Calculate Pr(X<U< Y) and estimate numerically (based on a visual plot, for example) the value of λ that maximizes this probability.
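A Monte Carlo sketch for the numerical part of the problem above (the grid range and sample size are arbitrary choices): estimate Pr(X < U < Y) for each λ on a grid and report the maximizer.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000  # samples per grid point

def estimate_prob(lam, n=n):
    """Monte Carlo estimate of Pr(X < U < Y) with X, Y ~ Exp(lam), U ~ Unif(0, 1)."""
    x = rng.exponential(1 / lam, n)   # NumPy parameterizes Exp by its mean 1/lam
    y = rng.exponential(1 / lam, n)
    u = rng.uniform(0.0, 1.0, n)
    return np.mean((x < u) & (u < y))

lams = np.linspace(0.2, 4.0, 39)      # grid of candidate rates, step 0.1
probs = [estimate_prob(l) for l in lams]
best_lam = float(lams[int(np.argmax(probs))])
best_p = float(max(probs))
```

Plotting `probs` against `lams` shows a single interior maximum; the simulation puts it near λ ≈ 1.2, with a maximal probability of roughly 0.2.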
Let X and Y be independent random variables with variances σ₁² and σ₂², respectively. Consider the sum Z = aX + (1-a)Y, 0 < a < 1. Find the a that minimizes the variance of Z.
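By independence, Var(Z) = a²σ₁² + (1-a)²σ₂²; setting the derivative in a to zero gives the candidate a* = σ₂²/(σ₁² + σ₂²). A brute-force check with illustrative variances (the values 4 and 9 are assumptions for the sketch):

```python
# Illustrative variances for X and Y (assumed values, not from the problem).
s1, s2 = 4.0, 9.0

def var_z(a):
    """Var(Z) for Z = aX + (1-a)Y with independent X, Y."""
    return a**2 * s1 + (1 - a)**2 * s2

a_star = s2 / (s1 + s2)               # calculus answer: a* = s2 / (s1 + s2)
grid = [i / 1000 for i in range(1, 1000)]
a_grid = min(grid, key=var_z)         # brute-force minimizer over (0, 1)
```

The grid minimizer agrees with a* to the grid resolution, confirming the second-order condition (the quadratic in a opens upward).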
Let Y₁, Y₂, Y₃, and Y₄ be independent, identically distributed random variables from a population with mean μ and variance σ². Consider a different estimator of μ: W = Y₁ + Y₂ + 2Y₃ + Y₄. This is an example of a weighted average of the Yᵢ. a) Show that W is a linear estimator. b) Is W an unbiased estimator of μ? Show that it is, or that it isn't. c) Find the variance of W and compare it to the variance of the sample...
Problem 3 Consider extending the linear MMSE estimator to the case where our estimate of a random variable Y is based on observations of multiple random variables X₁, X₂, …, X_N. Then our linear MMSE estimator can be written in the following form: … (a) Show that the optimal values a₁, a₂, …, a_N for the linear MMSE estimator are given by a = C_XX⁻¹ c_XY, where C_XX is the covariance matrix of X₁, X₂, …, X_N and c_XY is the cross-correlation vector with entries E[XᵢY]. (b)...
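A small numerical sketch of the matrix solution in part (a), assuming zero-mean variables (so covariances reduce to raw second moments): generate X₁, X₂, X₃ with hypothetical ground-truth coefficients, estimate C_XX and c_XY from samples, and solve a = C_XX⁻¹ c_XY.

```python
import numpy as np

rng = np.random.default_rng(1)
n, N = 100_000, 3
true_w = np.array([1.0, -2.0, 0.5])    # hypothetical coefficients for the sketch

X = rng.normal(size=(n, N))            # zero-mean observations X_1 .. X_N
Y = X @ true_w + rng.normal(scale=0.1, size=n)   # Y linear in X plus small noise

Cxx = (X.T @ X) / n                    # sample covariance matrix of X (zero mean)
cxy = (X.T @ Y) / n                    # sample cross-correlation vector E[X_i Y]
a_opt = np.linalg.solve(Cxx, cxy)      # a = Cxx^{-1} c_XY, the LMMSE coefficients
```

Because Y is (nearly) linear in X here, the recovered coefficients `a_opt` land very close to `true_w`, which is exactly what the LMMSE solution should do.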
(a) Suppose X ∼ Poisson(λ) and Y ∼ Poisson(γ) are independent; prove that X + Y ∼ Poisson(λ + γ). (b) Let X₁, …, Xₙ be an iid random sample from Poisson(λ); provide a sufficient statistic for λ and justify your answer. (c) Under the setting of part (b), show that λ̂ = (1/n) Σᵢ₌₁ⁿ Xᵢ is a consistent estimator of λ. (d) Use the Central Limit Theorem to find an asymptotic normal distribution for λ̂ defined in part (c), justify...
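A simulation sketch for parts (c) and (d), with an illustrative rate λ = 3 (an assumed value): the sample mean settles near λ as n grows, and √n(λ̂ − λ) has mean ≈ 0 and variance ≈ λ, matching the asymptotic N(0, λ) distribution the CLT gives since Var(Xᵢ) = λ.

```python
import numpy as np

rng = np.random.default_rng(2)
lam = 3.0                                # illustrative true rate (assumed)

# Consistency (part c): the sample mean of a large sample is close to lam.
big_sample = rng.poisson(lam, 1_000_000)
lam_hat = big_sample.mean()

# CLT (part d): over many replications, sqrt(n)*(mean - lam) ~ N(0, lam) approximately.
n, reps = 400, 5_000
samples = rng.poisson(lam, (reps, n))
z = np.sqrt(n) * (samples.mean(axis=1) - lam)
```

Here `z.mean()` is near 0 and `z.var()` is near λ = 3; a histogram of `z` would look bell-shaped, as the CLT predicts.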