Suppose that X1, X2 are two independent random variables with a common mean μ, but two different variances σ1² > σ2². Consider the family of estimators
Wα = αX1 + (1 − α)X2
where 0 ≤ α ≤ 1.
(a) Show that Wα is an unbiased estimator of μ for any value of α.
(b) Find the value of α which makes Wα as efficient as possible. Explain why the resulting formula makes sense.
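For part (b): by independence, Var(Wα) = α²σ1² + (1 − α)²σ2², and minimizing this quadratic in α gives α* = σ2²/(σ1² + σ2²), i.e. inverse-variance weighting. A quick numerical check (the specific variances below are just an illustrative choice with σ1² > σ2²):

```python
import numpy as np

# Var(W_alpha) = alpha^2 * s1 + (1 - alpha)^2 * s2 for independent X1, X2,
# where s1 = sigma1^2 and s2 = sigma2^2.
s1, s2 = 4.0, 1.0  # example variances, sigma1^2 > sigma2^2

alphas = np.linspace(0.0, 1.0, 100001)
var_w = alphas**2 * s1 + (1 - alphas)**2 * s2

best = alphas[np.argmin(var_w)]   # grid minimizer of Var(W_alpha)
formula = s2 / (s1 + s2)          # closed form: alpha* = sigma2^2/(sigma1^2+sigma2^2)
print(best, formula)              # both ≈ 0.2
```

The formula makes sense because the estimator with the smaller variance (here X2) receives the larger weight.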
Let X1 and X2 be independent random variables with mean μ and variance σ². Suppose we have two estimators ... (1) Are both estimators unbiased estimators for θ? (2) Which is a better estimator?
Please solve these questions:
1. Suppose that X1, X2, and X3 are random variables with common mean μ and a given variance (covariance) matrix. Find E[X1² + 2X1X2 − 4X2X3 + X3²].
2. If X1, X2, ..., Xn are independent random variables with common mean μ and variances σ1², σ2², ..., σn², prove that Σi(Xi − X̄)²/[n(n − 1)] is an unbiased estimate of Var[X̄].
3. Suppose that in Exercise 2 the variances are known. Let X̄w = Σi wiXi be an unbiased estimate of μ (i.e., Σi ...
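For question 2, a short calculation gives E[Σ(Xi − X̄)²] = (1 − 1/n)Σσi², so dividing by n(n − 1) yields Σσi²/n² = Var(X̄). A Monte Carlo sanity check of this identity (the variances, mean, and sample size below are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 3.0
sig2 = np.array([1.0, 4.0, 9.0, 2.0, 5.0])  # unequal variances sigma_i^2
n = len(sig2)

reps = 200_000
# Each column i is drawn with its own variance sigma_i^2.
X = rng.normal(mu, np.sqrt(sig2), size=(reps, n))
xbar = X.mean(axis=1)
est = ((X - xbar[:, None])**2).sum(axis=1) / (n * (n - 1))

print(est.mean())         # Monte Carlo average of the estimator
print(sig2.sum() / n**2)  # true Var(X̄) = (sum sigma_i^2)/n^2 = 0.84
```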
Let X1, X2, X3, and X4 be a random sample of observations from a population with mean μ and variance σ². The observations are independent because they were randomly drawn. Consider the following two point estimators of the population mean μ: μ̂1 = 0.10 X1 + 0.40 X2 + 0.40 X3 + 0.10 X4 and μ̂2 = 0.20 X1 + 0.30 X2 + 0.30 X3 + 0.20 X4. Which of the following statements is true? HINT: Use the definition of...
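Both weight vectors sum to 1, so both estimators are unbiased; efficiency is then decided by the sum of squared weights, since Var(Σ wiXi) = (Σ wi²)σ² for independent observations with common variance. Checking the two weight sets:

```python
import numpy as np

w1 = np.array([0.10, 0.40, 0.40, 0.10])
w2 = np.array([0.20, 0.30, 0.30, 0.20])

# Both sum to 1, so both estimators are unbiased for mu.
print(w1.sum(), w2.sum())  # 1.0 1.0

# Var(sum w_i X_i) = (sum w_i^2) * sigma^2 for iid observations.
print((w1**2).sum())  # 0.34 -> Var(mu_hat_1) = 0.34 sigma^2
print((w2**2).sum())  # 0.26 -> Var(mu_hat_2) = 0.26 sigma^2 (more efficient)
```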
Let X1 and X2 be independent random variables with mean μ and variance σ². Suppose that we have two estimators of μ: θ̂1 = (X1 + X2)/2 and θ̂2 = (X1 + 3X2)/2. (a) Are both estimators unbiased estimators of μ? (b) What is the variance of each estimator?
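Reading the two estimators as (X1 + X2)/2 and (X1 + 3X2)/2 (the denominators as printed; some textbook versions divide the second by 4 instead), a simulation shows E[θ̂1] = μ, E[θ̂2] = 2μ, Var(θ̂1) = σ²/2, and Var(θ̂2) = 10σ²/4. The parameter values below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 5.0, 2.0
reps = 500_000
X1 = rng.normal(mu, sigma, reps)
X2 = rng.normal(mu, sigma, reps)

t1 = (X1 + X2) / 2
t2 = (X1 + 3 * X2) / 2  # as printed; biased, since E[t2] = 2*mu

print(t1.mean(), t2.mean())  # ≈ mu = 5 and ≈ 2*mu = 10 -> only t1 is unbiased
print(t1.var(), t2.var())    # ≈ sigma^2/2 = 2 and ≈ 10*sigma^2/4 = 10
```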
Let X1, ..., Xn be independent and identically distributed random variables with unknown mean μ and unknown variance σ². It is given that the sample variance S² is an unbiased estimator of σ². Suggest why the estimator X̄² − S²/n might be proposed for estimating μ², and justify your answer.
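Reading the garbled estimator as X̄² − S²/n proposed for μ² (a reconstruction, chosen so the unbiasedness argument works out): since E[X̄²] = Var(X̄) + μ² = σ²/n + μ² and E[S²/n] = σ²/n, the difference has expectation μ². A simulation check, with arbitrary μ, σ, and n:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n = 3.0, 2.0, 10
reps = 400_000
X = rng.normal(mu, sigma, size=(reps, n))

xbar = X.mean(axis=1)
s2 = X.var(axis=1, ddof=1)  # unbiased sample variance S^2
est = xbar**2 - s2 / n      # proposed estimator of mu^2

# E[xbar^2] = mu^2 + sigma^2/n, and subtracting S^2/n removes the bias term.
print(est.mean())  # ≈ mu^2 = 9
```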
Let X1, X2, ..., Xn be iid exponential random variables with unknown mean β. (b) Find the maximum likelihood estimator of β. (c) Determine whether the maximum likelihood estimator is unbiased for β. (d) Find the mean squared error of the maximum likelihood estimator of β. (e) Find the Cramér-Rao lower bound for the variances of unbiased estimators of β. (f) What is the UMVUE (uniformly minimum variance unbiased estimator) of β? What is your reason? (g) Determine the asymptotic distribution of the...
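For the exponential model with mean β, the MLE is the sample mean X̄; it is unbiased, and its MSE equals its variance β²/n, which also attains the Cramér-Rao bound. A simulation illustrating these facts (β and n below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
beta, n = 2.0, 20  # true mean beta and sample size
reps = 300_000
X = rng.exponential(beta, size=(reps, n))  # numpy parameterizes by the mean (scale)

beta_hat = X.mean(axis=1)  # MLE of beta is the sample mean

print(beta_hat.mean())                # ≈ beta = 2 -> unbiased
print(((beta_hat - beta)**2).mean())  # ≈ beta^2/n = 0.2 (MSE = variance here)
print(beta**2 / n)                    # Cramér-Rao lower bound, attained by X̄
```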
1. (40) Suppose that X1, X2, ..., Xn forms an independent and identically distributed sample from a normal distribution with mean μ and variance σ², both unknown. (a) Derive the sample variance, S², for this random sample. (b) Derive the maximum likelihood estimators (MLEs) of μ and σ², denoted μ̂ and σ̂², respectively. (c) Find the MLE of μ³. (d) Derive the method of moments estimators of μ and σ², denoted μ̂MOM and σ̂²MOM, respectively. (e) Show that μ̂ and...
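For the normal sample, the MLEs are μ̂ = X̄ and σ̂² = (1/n)Σ(Xi − X̄)², which differs from S² only by the factor (n − 1)/n; by the invariance property, the MLE of μ³ is X̄³. A small check of these relationships (sample values are randomly generated for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma, n = 1.5, 2.0, 8
x = rng.normal(mu, sigma, n)

xbar = x.mean()
s2 = x.var(ddof=1)          # sample variance S^2 (divides by n-1)
sigma2_mle = x.var(ddof=0)  # MLE of sigma^2 (divides by n)

# The MLE equals S^2 scaled by (n-1)/n.
print(np.isclose(sigma2_mle, s2 * (n - 1) / n))  # True

# By invariance of MLEs, the MLE of mu^3 is xbar cubed.
print(xbar**3)
```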
Let X1, X2, X3 be independent random variables with E(X1) = 1, E(X2) = 2 and E(X3) = 3. Let Y = 3X1 − 2X2 + X3. Find E(Y), Var(Y) in the following examples. (i) X1, X2, X3 are Poisson. [Recall that the variance of Poisson(λ) is λ.] (ii) X1, X2, X3 are normal, with respective variances σ1² = 1, σ2² = 3, σ3² = 5. Find P(0 ≤ Y ≤ 5). [Recall that any linear combination of independent normal...
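By linearity, E(Y) = 3·1 − 2·2 + 1·3 = 2 in both cases; by independence, Var(Y) = Σ ai²Var(Xi), which gives 20 in the Poisson case (variances equal the means) and 26 in the normal case. Since Y is then N(2, 26), the probability can be computed from the standard normal CDF:

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

a = [3, -2, 1]     # coefficients in Y = 3*X1 - 2*X2 + X3
means = [1, 2, 3]  # E(X1), E(X2), E(X3)

EY = sum(c * m for c, m in zip(a, means))                   # linearity -> 2
var_poisson = sum(c**2 * lam for c, lam in zip(a, means))   # Var = mean for Poisson -> 20
var_normal = sum(c**2 * v for c, v in zip(a, [1, 3, 5]))    # given variances -> 26

# Y ~ N(2, 26) in the normal case, so standardize the endpoints.
p = Phi((5 - EY) / sqrt(var_normal)) - Phi((0 - EY) / sqrt(var_normal))
print(EY, var_poisson, var_normal, round(p, 4))
```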
Let X1 and X2 be independent random variables with means μ1 and μ2, and variances σ1² and σ2², respectively. Find the correlation of X1 and X1 + X2. Note: the covariance of random variables X, Y is defined by Cov(X, Y) = E[(X − E(X))(Y − E(Y))], and the correlation of X, Y is defined by Corr(X, Y) = Cov(X, Y) / √(Var(X)Var(Y)).
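Since Cov(X1, X1 + X2) = Var(X1) = σ1² by independence, the answer is Corr(X1, X1 + X2) = σ1/√(σ1² + σ2²). A Monte Carlo check of this formula (the standard deviations and means below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(5)
s1, s2 = 2.0, 3.0  # standard deviations sigma_1, sigma_2
reps = 500_000
X1 = rng.normal(0.0, s1, reps)
X2 = rng.normal(1.0, s2, reps)

r = np.corrcoef(X1, X1 + X2)[0, 1]   # empirical correlation
formula = s1 / np.sqrt(s1**2 + s2**2)  # Cov(X1, X1+X2) = sigma_1^2 -> closed form

print(r, formula)  # both ≈ 2/sqrt(13) ≈ 0.5547
```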