(a) Let X and Y be independent random variables, both with the same mean μ ≠ 0. ...
Suppose that X and Y are independent random variables with the same unknown mean μ, and that both have variance 36. Let T = aX + bY be an estimator of μ. What condition must a and b satisfy in order that T be an unbiased estimator of μ? Is T a normal random variable?
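Since E[T] = aE[X] + bE[Y] = (a + b)μ, unbiasedness requires a + b = 1. A minimal Monte Carlo sketch of that condition follows; drawing X and Y as normals is an illustrative assumption (the problem fixes only the mean and variance), and the particular values of μ, a, and b are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, var, n = 5.0, 36.0, 1_000_000

# Normality is an assumption made only so we can simulate;
# the unbiasedness condition depends only on the means.
X = rng.normal(mu, np.sqrt(var), n)
Y = rng.normal(mu, np.sqrt(var), n)

for a, b in [(0.5, 0.5), (0.3, 0.7), (0.5, 0.7)]:
    T = a * X + b * Y
    # E[T] = (a + b) * mu, so T is unbiased exactly when a + b = 1
    print(f"a={a}, b={b}, a+b={a + b}, mean(T)={T.mean():.3f}")
```

If X and Y are themselves normal (as simulated here), T is a linear combination of independent normals and hence normal; with only the mean and variance specified, normality of T does not follow in general.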
Let Y1, Y2, Y3, and Y4 be independent, identically distributed random variables from a population with mean μ and variance σ². Consider a different estimator of μ: W = (Y1 + Y2 + 2Y3 + Y4)/…. This is an example of a weighted average of the Yi. a) Show that W is a linear estimator. b) Is W an unbiased estimator of μ? Show that it is or that it isn't (compute E(W)). c) Find the variance of W and compare it to the variance of the sample...
Let Y1, Y2, Y3, and Y4 be independent, identically distributed random variables from a population with mean μ and variance σ². Let Ȳ = (1/4)(Y1 + Y2 + Y3 + Y4) denote the average of these four random variables. i. What are the expected value and variance of Ȳ in terms of μ and σ²? ii. Now consider a different estimator of μ: W = Y1/8 + Y2/8 + Y3/4 + Y4/2. This is an example of a weighted average of the Yi. Show...
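The weights in the estimator W are garbled in the source; the sketch below assumes the common textbook choice W = Y1/8 + Y2/8 + Y3/4 + Y4/2 (weights summing to 1, so W is unbiased) and compares its variance to that of the plain average Ȳ. The normal population and the values of μ and σ² are arbitrary simulation choices.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma2, n = 2.0, 9.0, 1_000_000
w = np.array([1 / 8, 1 / 8, 1 / 4, 1 / 2])  # assumed weights; they sum to 1

Y = rng.normal(mu, np.sqrt(sigma2), size=(n, 4))
Ybar = Y.mean(axis=1)  # the plain sample average
W = Y @ w              # the weighted average

# Both are unbiased (weights sum to 1), but W has larger variance:
# Var(Ybar) = sigma^2/4, while Var(W) = (sum of w_i^2) * sigma^2 = (11/32) sigma^2.
print(f"E[Ybar]≈{Ybar.mean():.3f}, E[W]≈{W.mean():.3f}  (both ≈ mu={mu})")
print(f"Var(Ybar)≈{Ybar.var():.3f} vs sigma^2/4={sigma2 / 4:.3f}")
print(f"Var(W)≈{W.var():.3f} vs (11/32)sigma^2={(w**2).sum() * sigma2:.3f}")
```

Since 11/32 > 1/4, the weighted average W is a less efficient unbiased estimator than Ȳ, which is the point of the comparison the exercise asks for.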
Let X1 and X2 be independent random variables with mean μ and variance σ². Suppose that we have two estimators of μ: θ̂1 = (X1 + X2)/2 and θ̂2 = (X1 + 3X2)/2. (a) Are both estimators unbiased estimators of μ? (b) What is the variance of each estimator?
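Reading the two estimators as θ̂1 = (X1 + X2)/2 and θ̂2 = (X1 + 3X2)/2 (the source formulas are garbled), a quick simulation confirms the bias and variance calculations; the normal population and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma2, n = 3.0, 4.0, 1_000_000

# Normality is assumed only for simulation; the problem fixes mean and variance.
X1 = rng.normal(mu, np.sqrt(sigma2), n)
X2 = rng.normal(mu, np.sqrt(sigma2), n)

t1 = (X1 + X2) / 2       # E = mu (unbiased), Var = sigma^2/2
t2 = (X1 + 3 * X2) / 2   # E = 2*mu (biased),  Var = (1+9)/4 * sigma^2

print(f"E[t1]≈{t1.mean():.3f} (mu={mu}), E[t2]≈{t2.mean():.3f} (2mu={2 * mu})")
print(f"Var(t1)≈{t1.var():.3f} (sigma^2/2={sigma2 / 2})")
print(f"Var(t2)≈{t2.var():.3f} ((10/4)sigma^2={10 * sigma2 / 4})")
```

So only θ̂1 is unbiased; θ̂2 overshoots the mean by a factor of two and also has five times the variance.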
Suppose X ~ Pois(λ) and Y ~ Pois(2λ) are independent random variables. Consider a linear estimator of λ, that is, λ̂ = aX + bY. (a) Find an expression for the bias of λ̂ in terms of a and b, and determine a condition on the values of a and b such that λ̂ is unbiased. (b) Of all the values of a and b that make the estimator unbiased, find the values of a and b that minimize the variance of the...
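Since E[λ̂] = aλ + 2bλ, unbiasedness requires a + 2b = 1, and under that constraint Var(λ̂) = a²λ + 2b²λ is minimized at a = b = 1/3 (set the derivative of (1 − 2b)² + 2b² to zero). The sketch below checks this numerically over a grid and simulates the resulting estimator; the value of λ is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n = 4.0, 1_000_000
X = rng.poisson(lam, n)
Y = rng.poisson(2 * lam, n)

# Unbiasedness: E[aX + bY] = (a + 2b) * lam, so require a + 2b = 1.
# Subject to that, Var = a^2 lam + 2 b^2 lam; minimize over the grid.
bs = np.linspace(0, 0.5, 501)
var = (1 - 2 * bs) ** 2 * lam + 2 * bs**2 * lam
b_star = bs[np.argmin(var)]
a_star = 1 - 2 * b_star
print(f"a*≈{a_star:.3f}, b*≈{b_star:.3f}  (both ≈ 1/3)")

est = (X + Y) / 3  # the minimum-variance unbiased linear estimator
print(f"mean≈{est.mean():.3f} (lam={lam}), var≈{est.var():.3f} (lam/3≈{lam / 3:.3f})")
```

Note that (X + Y)/3 simply pools the counts: X + Y ~ Pois(3λ), so dividing by 3 is the natural estimate of λ.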
Let X1, X2, …, Xn be independent random variables with mean μ and variance σ². Let Y1, Y2, …, Yn be independent random variables with mean μ and variance σ². Let W = aX̄ + (1 − a)Ȳ, where 0 < a < 1. a) Compute the expected value of W. b) For what value of a is the variance of W a minimum?
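Here E[W] = aμ + (1 − a)μ = μ for every a, and Var(W) = [a² + (1 − a)²]σ²/n, which is minimized at a = 1/2. A minimal numeric check of that minimization (the values of σ² and n are arbitrary):

```python
import numpy as np

sigma2, n = 4.0, 25  # arbitrary illustrative values

a = np.linspace(0, 1, 1001)
# Var(W) = Var(a*Xbar) + Var((1-a)*Ybar) = [a^2 + (1-a)^2] * sigma^2 / n
var_W = (a**2 + (1 - a) ** 2) * sigma2 / n
a_star = a[np.argmin(var_W)]
print(f"a*={a_star}, min Var(W)={var_W.min()}  (= sigma^2/(2n)={sigma2 / (2 * n)})")
```

With equal variances the best weight is the symmetric a = 1/2, giving Var(W) = σ²/(2n), i.e. pooling the two samples into one of size 2n.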
The random variables X and Y are independent with exponential densities fX(x) = e^(−x) u(x)… (a) Let Z = 2X + … and W = …. Find the joint density of random variables Z and W. (b) Find the density of random variable W. (c) Find the density of random variable Z.
Let X and Y be two independent Gaussian random variables with common variance σ². The mean of X is m and Y is a zero-mean random variable. We define the random variable V as V = √(X² + Y²). Show that

fV(v) = (v/σ²) e^(−(v² + m²)/(2σ²)) I0(mv/σ²) for v ≥ 0, and fV(v) = 0 for v < 0,

where I0(x) = (1/2π) ∫0^{2π} e^(x cos u) du is called the modified Bessel function of the first kind and zero order. The distribution of V is known as the Ricean distribution. Show that, in the special case of m = 0, the Ricean distribution simplifies...
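The stated Ricean density can be checked numerically: simulate V = √(X² + Y²) and compare its empirical CDF against the integral of fV. The particular values of m and σ are arbitrary, and `scipy.special.i0` supplies the Bessel function I0.

```python
import numpy as np
from scipy.special import i0
from scipy.integrate import quad

rng = np.random.default_rng(4)
m, sigma, n = 1.5, 1.0, 500_000

X = rng.normal(m, sigma, n)    # mean m, variance sigma^2
Y = rng.normal(0.0, sigma, n)  # zero mean, same variance
V = np.hypot(X, Y)             # V = sqrt(X^2 + Y^2)

def rice_pdf(v):
    # f_V(v) = (v/sigma^2) exp(-(v^2 + m^2)/(2 sigma^2)) I0(m v / sigma^2), v >= 0
    return v / sigma**2 * np.exp(-(v**2 + m**2) / (2 * sigma**2)) * i0(m * v / sigma**2)

for v0 in (1.0, 2.0, 3.0):
    emp = (V <= v0).mean()           # Monte Carlo P(V <= v0)
    thy, _ = quad(rice_pdf, 0, v0)   # integral of the claimed density
    print(f"v0={v0}: empirical {emp:.4f} vs theoretical {thy:.4f}")
```

Setting m = 0 makes I0(0) = 1, and the density collapses to fV(v) = (v/σ²) e^(−v²/(2σ²)), the Rayleigh density, which is the simplification the last part asks for.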
9. Let X and Y be independent and identically distributed random variables with mean μ and variance σ². Find the following: (a) E[(X + 2)²] (b) Var(3X + 4) (c) E[(X − Y)²] (d) Cov(X + Y, X − Y)
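The four answers follow from the standard identities E[Z²] = Var(Z) + (E[Z])², Var(aZ + b) = a²Var(Z), and Cov(X + Y, X − Y) = Var(X) − Var(Y) = 0. A Monte Carlo check (the squared moments in (a) and (c) are a reading of the garbled source; normality and the parameter values are simulation assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
mu, sigma2, n = 1.0, 2.0, 1_000_000

X = rng.normal(mu, np.sqrt(sigma2), n)
Y = rng.normal(mu, np.sqrt(sigma2), n)

# (a) E[(X+2)^2] = Var(X) + (mu+2)^2
print(((X + 2) ** 2).mean(), sigma2 + (mu + 2) ** 2)
# (b) Var(3X+4) = 9 sigma^2
print((3 * X + 4).var(), 9 * sigma2)
# (c) E[(X-Y)^2] = Var(X-Y) + 0^2 = 2 sigma^2
print(((X - Y) ** 2).mean(), 2 * sigma2)
# (d) Cov(X+Y, X-Y) = Var(X) - Var(Y) = 0 for i.i.d. X, Y
print(np.cov(X + Y, X - Y)[0, 1])
```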
Let X and Y be two independent random variables with X ∼ R(0, 2) (uniform on (0, 2)) and Y ∼ Exp(1). (a) Use the convolution formula to calculate the probability density function of W = X + Y. (b) Derive the probability density function of U = XY.
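For part (a), the convolution fW(w) = ∫ fX(x) fY(w − x) dx with fX = 1/2 on (0, 2) gives fW(w) = (1/2)(1 − e^(−w)) for 0 < w < 2 and fW(w) = (1/2)(e² − 1)e^(−w) for w ≥ 2. The sketch below checks this piecewise result against simulated cell probabilities over short intervals chosen to avoid the kink at w = 2.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1_000_000
X = rng.uniform(0, 2, n)      # R(0, 2)
Y = rng.exponential(1.0, n)   # Exp(1)
W = X + Y

def f_W(w):
    # piecewise result of the convolution integral
    return np.where(w < 2,
                    0.5 * (1 - np.exp(-w)),
                    0.5 * (np.exp(2) - 1) * np.exp(-w))

# Compare P(lo <= W < hi) with the midpoint approximation f_W(mid)*(hi-lo);
# the intervals are short and sit entirely on one branch of f_W.
for lo, hi in [(0.5, 0.6), (1.0, 1.1), (2.5, 2.6)]:
    emp = ((W >= lo) & (W < hi)).mean()
    thy = f_W((lo + hi) / 2) * (hi - lo)
    print(f"[{lo}, {hi}): empirical {emp:.4f} vs density {thy:.4f}")
```

The same Monte Carlo pattern works for part (b) with U = XY once a candidate density has been derived.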