In 10.11, let X1, X2, ..., Xn and Y1, Y2, ..., Yn be independent samples from N(μ, σ²) and N(...
Suppose that X1, X2, ..., Xn and Y1, Y2, ..., Yn are independent random samples from populations with the same mean μ and variances σ₁² and σ₂², respectively; that is, Xi ~ N(μ, σ₁²) and Yi ~ N(μ, σ₂²). Show that (2X̄ + 3Ȳ)/5 is a consistent estimator of μ.
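A quick simulation sketch (not part of the original problem) illustrates the consistency claim; the values of μ, σ₁, and σ₂ below are arbitrary choices:

```python
import random

def combined_estimate(n, mu=5.0, sd_x=2.0, sd_y=3.0, seed=0):
    """Compute (2*Xbar + 3*Ybar)/5 from one pair of samples of size n."""
    rng = random.Random(seed)
    xbar = sum(rng.gauss(mu, sd_x) for _ in range(n)) / n
    ybar = sum(rng.gauss(mu, sd_y) for _ in range(n)) / n
    return (2 * xbar + 3 * ybar) / 5

# Consistency: the estimates should tighten around mu as n grows.
for n in (10, 1000, 100_000):
    print(n, combined_estimate(n))
```

Since the weights 2/5 and 3/5 sum to 1, the estimator is unbiased, and its variance (4σ₁² + 9σ₂²)/(25n) vanishes as n grows, which is what the simulation shows.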
6. Suppose X1, Y1, X2, Y2, ..., Xn, Yn are independent rv's with Xi and Yi both N(μi, σ²). All parameters μi, i = 1, ..., n, and σ² are unknown. For example, Xi and Yi may be repeated measurements on a laboratory specimen from the ith individual, with μi representing the amount of some antigen in the specimen; the measuring instrument is inaccurate, with normally distributed errors of constant variability. Let Zi = (Xi − Yi)/√2. (a) Consider the estimate σ̂² = ... (b) Show that the...
Let X1, ..., Xn and Y1, ..., Ym be two independent samples from a Poisson distribution with parameter λ. Let a, b be two positive numbers, and consider the following estimator for λ: λ̂ = a(X1 + ... + Xn)/n + b(Y1 + ... + Ym)/m. (a) What condition is needed on a and b so that λ̂ is unbiased? (b) What is the MSE of λ̂?
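A simulation sketch (with arbitrary λ, n, m, a, b, and using the estimator form λ̂ = aX̄ + bȲ as reconstructed above) illustrates both parts: with a + b = 1 the empirical mean lands on λ, and the empirical MSE matches a²λ/n + b²λ/m:

```python
import math
import random

def poisson_draw(lam, rng):
    """Knuth's multiplication method; adequate for the small lam used here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def lam_hat(a, b, xs, ys):
    """The estimator a*Xbar + b*Ybar from the problem."""
    return a * sum(xs) / len(xs) + b * sum(ys) / len(ys)

rng = random.Random(0)
lam, n, m, a, b = 4.0, 5, 8, 0.4, 0.6   # a + b = 1: the unbiasedness condition
reps = 20_000
ests = [lam_hat(a, b,
                [poisson_draw(lam, rng) for _ in range(n)],
                [poisson_draw(lam, rng) for _ in range(m)])
        for _ in range(reps)]
mean_est = sum(ests) / reps
mse = sum((e - lam) ** 2 for e in ests) / reps
print(mean_est)   # near lam = 4 when a + b = 1
print(mse)        # near a^2*lam/n + b^2*lam/m = 0.308
```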
5. We have two independent samples of n observations X1, X2, ..., Xn and Y1, Y2, ..., Yn. We want to test the hypothesis H0: μX = μY versus the alternative H1: μX ≠ μY. (a) First, assume that the null hypothesis H0 is true and find the MLE for μ = μX = μY. (b) Then plug this estimate into the log-likelihood along with the MLEs μ̂X = X̄ and μ̂Y = Ȳ to calculate the LRT statistic. (c) Is this likelihood ratio test equivalent to the test...
8. Let X1, X2, ..., Xn be mutually independent, each distributed Normal(μ, σ²). (a) Find the distribution of U = Σᵢ₌₁ᵐ Xi for a positive integer m < n. (b) Find the distribution of Z², where Z = (U − mμ)/(σ√m). Hint: can the solution from problem #2 be applied here for specific values of a and b?
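As a sanity check on part (a), a simulation sketch (arbitrary μ, σ, m below) shows the partial sum U behaving as Normal with mean mμ and variance mσ²:

```python
import random

def partial_sum(m, mu, sigma, rng):
    """U = X1 + ... + Xm for i.i.d. Normal(mu, sigma^2) draws."""
    return sum(rng.gauss(mu, sigma) for _ in range(m))

rng = random.Random(5)
mu, sigma, m, reps = 2.0, 1.5, 4, 20_000
us = [partial_sum(m, mu, sigma, rng) for _ in range(reps)]
mean_u = sum(us) / reps
var_u = sum((u - mean_u) ** 2 for u in us) / (reps - 1)
print(mean_u, var_u)   # near m*mu = 8 and m*sigma^2 = 9
```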
Consider a random sample (X1, Y1), (X2, Y2), ..., (Xn, Yn) where Y | X = x is modeled by a N(β0 + β1x, σ²) distribution, where β0, β1, and σ² are unknown. (a) Prove that the MLE of β1 is an unbiased estimator of β1. (b) Prove that the MLE of β0 is an unbiased estimator of β0.
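For Gaussian errors the MLEs of β0 and β1 coincide with the least-squares estimates, so a simulation sketch (arbitrary true values and design points below) can illustrate the unbiasedness being asked for:

```python
import random

def ols_fit(xs, ys):
    """Least-squares fit; for Normal errors this is also the MLE."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    sxx = sum((x - xbar) ** 2 for x in xs)
    b1 = sxy / sxx
    b0 = ybar - b1 * xbar
    return b0, b1

rng = random.Random(42)
xs = [i / 10 for i in range(20)]          # fixed design points
b0_true, b1_true, sigma = 1.5, 2.0, 0.5   # arbitrary true values
reps = 5000
b0_sum = b1_sum = 0.0
for _ in range(reps):
    ys = [b0_true + b1_true * x + rng.gauss(0, sigma) for x in xs]
    b0, b1 = ols_fit(xs, ys)
    b0_sum += b0
    b1_sum += b1
print(b0_sum / reps, b1_sum / reps)   # both near the true (1.5, 2.0)
```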
1. Let X1, ..., Xn be a sample of size n from a distribution with expectation μ and variance σ², and let μ̂ = (2X1 + X2 + ... + Xn-1 + 2Xn)/(n + 1) be an estimator for μ. Is it unbiased? Asymptotically unbiased? Consistent?
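The coefficients of μ̂ sum to n + 2, so E[μ̂] = (n + 2)μ/(n + 1). A simulation sketch (arbitrary μ, σ) makes this visible: μ̂ is biased for each finite n, but the bias factor (n + 2)/(n + 1) tends to 1 as n grows:

```python
import random

def mu_hat(sample):
    """(2*X1 + X2 + ... + X_{n-1} + 2*Xn) / (n + 1)."""
    n = len(sample)
    return (sample[0] + sum(sample) + sample[-1]) / (n + 1)

rng = random.Random(0)
mu, sigma = 3.0, 1.0
for n in (2, 5, 50):
    reps = 20_000
    avg = sum(mu_hat([rng.gauss(mu, sigma) for _ in range(n)])
              for _ in range(reps)) / reps
    print(n, avg, (n + 2) / (n + 1) * mu)   # empirical mean vs exact E[mu_hat]
```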
2. A random sample X1, X2, ..., Xn of size n is taken from N(μ, σ²), where the mean μ is a known real number. Show that the maximum likelihood estimator for σ² is σ̂²_MLE = (1/n) Σᵢ₌₁ⁿ (Xi − μ)², and that this estimator is an unbiased estimator of σ². (Hint: Σᵢ₌₁ⁿ (Xi − μ)²/σ² follows χ²(n).)
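A simulation sketch (arbitrary μ, σ², n) illustrates the unbiasedness claim for the known-μ MLE:

```python
import random

def sigma2_mle(sample, mu):
    """MLE of sigma^2 when mu is known: (1/n) * sum (Xi - mu)^2."""
    return sum((x - mu) ** 2 for x in sample) / len(sample)

rng = random.Random(7)
mu, sigma2, n, reps = 0.0, 4.0, 10, 20_000
avg = sum(sigma2_mle([rng.gauss(mu, sigma2 ** 0.5) for _ in range(n)], mu)
          for _ in range(reps)) / reps
print(avg)   # close to sigma2 = 4.0: with mu known, the MLE is unbiased
```

Note the contrast with the usual case of unknown μ, where dividing by n (rather than n − 1) makes the MLE biased.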
1. Let X1, X2, ..., Xn be independent Normal(μ, σ²) random variables. Let Yn = (1/n) Σᵢ₌₁ⁿ Xi denote a sequence of random variables. (a) Find E(Yn) and Var(Yn) for all n in terms of μ and σ². (b) Find the PDF of Yn for all n. (c) Find the MGF of Yn for all n.
3. Let X1, X2, ..., Xn be independent samples of a random variable with the probability density function (PDF) fX(x) = θ(x − 1/2) + 1 for 0 ≤ x ≤ 1, and 0 otherwise, where θ ∈ [−2, 2] is an unknown parameter. We define the estimator θ̂n = 12X̄ − 6 to estimate θ. (a) Is θ̂n an unbiased estimator of θ? (b) Is θ̂n a consistent estimator of θ? (c) Find the mean squared...
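Since E[X] = θ/12 + 1/2 under this density, θ̂n = 12X̄ − 6 targets θ. A simulation sketch (arbitrary choice θ = 1, sampling by inverting the CDF F(x) = (θ/2)x² + (1 − θ/2)x) illustrates parts (a) and (b):

```python
import math
import random

def sample_x(theta, rng):
    """Inverse-CDF draw from f(x) = theta*(x - 1/2) + 1 on [0, 1]."""
    u = rng.random()
    if theta == 0:
        return u
    b = 1 - theta / 2
    # Solve (theta/2)*x^2 + b*x - u = 0 for the root in [0, 1].
    return (-b + math.sqrt(b * b + 2 * theta * u)) / theta

def theta_hat(sample):
    """The estimator 12*Xbar - 6 from the problem."""
    return 12 * sum(sample) / len(sample) - 6

rng = random.Random(3)
theta = 1.0
# The estimates should concentrate around theta as n grows.
for n in (10, 1000, 100_000):
    print(n, theta_hat([sample_x(theta, rng) for _ in range(n)]))
```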