Let $X_1,\dots,X_n$ be i.i.d. from $N(\mu_1,\sigma^2)$, and $Y_1,\dots,Y_m$ be i.i.d. from $N(\mu_2,\sigma^2)$. If the two samples are independent, find the maximum likelihood estimates for $\mu_1$, $\mu_2$, and the common variance $\sigma^2$.
It is given that
$$f(x_i;\mu_1,\sigma^2)=\frac{1}{\sqrt{2\pi\sigma^2}}\exp\!\Big(-\frac{(x_i-\mu_1)^2}{2\sigma^2}\Big) \quad\text{and}\quad f(y_j;\mu_2,\sigma^2)=\frac{1}{\sqrt{2\pi\sigma^2}}\exp\!\Big(-\frac{(y_j-\mu_2)^2}{2\sigma^2}\Big).$$
The joint likelihood is
$$L(\mu_1,\mu_2,\sigma^2)=(2\pi\sigma^2)^{-(n+m)/2}\exp\!\Big(-\frac{1}{2\sigma^2}\Big[\sum_{i=1}^n (x_i-\mu_1)^2+\sum_{j=1}^m (y_j-\mu_2)^2\Big]\Big).$$
The log-likelihood is
$$\ell(\mu_1,\mu_2,\sigma^2)=-\frac{n+m}{2}\log(2\pi\sigma^2)-\frac{1}{2\sigma^2}\Big[\sum_{i=1}^n (x_i-\mu_1)^2+\sum_{j=1}^m (y_j-\mu_2)^2\Big].$$
Taking partial derivatives and equating to 0,
$$\frac{\partial\ell}{\partial\mu_1}=\frac{1}{\sigma^2}\sum_{i=1}^n (x_i-\mu_1)=0 \;\Longrightarrow\; \hat\mu_1=\bar X.$$
Similarly, $\hat\mu_2=\bar Y$, and setting $\partial\ell/\partial\sigma^2=0$ gives
$$\hat\sigma^2=\frac{1}{n+m}\Big[\sum_{i=1}^n (X_i-\bar X)^2+\sum_{j=1}^m (Y_j-\bar Y)^2\Big].$$
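A quick check on this pooled estimator, for reference (a standard fact, not part of the original solution): since $E\big[\sum_i (X_i-\bar X)^2\big]=(n-1)\sigma^2$ and $E\big[\sum_j (Y_j-\bar Y)^2\big]=(m-1)\sigma^2$,
$$E[\hat\sigma^2]=\frac{(n-1)+(m-1)}{n+m}\,\sigma^2=\frac{n+m-2}{n+m}\,\sigma^2,$$
so the MLE is biased; dividing by $n+m-2$ instead of $n+m$ gives the usual unbiased pooled variance.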
Problem 3. Consider two independent samples, $X_1,\dots,X_m$ from a $N(\mu_1,\sigma_1^2)$ distribution and $Y_1,\dots,Y_n$ from a $N(\mu_2,\sigma_2^2)$ distribution. Here $\mu_1$, $\mu_2$, $\sigma_1^2$, and $\sigma_2^2$ are unknown. Consider testing the null hypothesis that the two population variances are equal, $H_0:\sigma_1^2=\sigma_2^2$, against the alternative that these variances are different, $H_1:\sigma_1^2\neq\sigma_2^2$. (a) Derive the LR test statistic $\Lambda$.
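One standard route to (a), sketched for reference (an outline, not the intended solution): maximize the likelihood with and without the constraint $\sigma_1^2=\sigma_2^2$ and take the ratio. The unconstrained and constrained MLEs of the variances are
$$\hat\sigma_1^2=\frac{1}{m}\sum_{i=1}^m (X_i-\bar X)^2,\qquad \hat\sigma_2^2=\frac{1}{n}\sum_{j=1}^n (Y_j-\bar Y)^2,\qquad \hat\sigma_0^2=\frac{m\hat\sigma_1^2+n\hat\sigma_2^2}{m+n},$$
and substituting them back into the normal likelihoods gives
$$\Lambda=\frac{\sup_{H_0}L}{\sup L}=\frac{(\hat\sigma_1^2)^{m/2}\,(\hat\sigma_2^2)^{n/2}}{(\hat\sigma_0^2)^{(m+n)/2}},$$
which is a monotone function of the variance ratio $S_X^2/S_Y^2$.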
Let $X_1,\dots,X_n$ be a random sample (i.i.d.) from a normal distribution with parameters $\mu$, $\sigma^2$. (a) Find the maximum likelihood estimates of $\mu$ and $\sigma^2$. (b) Compare your MLEs of $\mu$ and $\sigma^2$ with the sample mean and sample variance. Are they the same?
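For reference, the standard answers (a sketch): maximizing the log-likelihood gives
$$\hat\mu=\bar X,\qquad \hat\sigma^2=\frac{1}{n}\sum_{i=1}^n (X_i-\bar X)^2,$$
so $\hat\mu$ equals the sample mean, while $\hat\sigma^2=\frac{n-1}{n}S^2$ differs from the sample variance $S^2=\frac{1}{n-1}\sum_{i=1}^n (X_i-\bar X)^2$ by the factor $(n-1)/n$.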
Let $X_1,\dots,X_n$ and $Y_1,\dots,Y_m$ be two independent samples from a Poisson distribution with parameter $\lambda$. Let $a,b$ be two positive numbers. Consider the following estimator for $\lambda$:
$$\hat\lambda=a\,\frac{X_1+\dots+X_n}{n}+b\,\frac{Y_1+\dots+Y_m}{m}.$$
(a) What condition is needed on $a$ and $b$ so that $\hat\lambda$ is unbiased? (b) What is the MSE of $\hat\lambda$?
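A sketch of the standard computation, for reference (using $E\bar X=E\bar Y=\lambda$, $\operatorname{Var}\bar X=\lambda/n$, $\operatorname{Var}\bar Y=\lambda/m$):
$$E[\hat\lambda]=(a+b)\lambda,$$
so $\hat\lambda$ is unbiased iff $a+b=1$. In general,
$$\operatorname{MSE}(\hat\lambda)=\operatorname{Var}(\hat\lambda)+\big(\operatorname{bias}\big)^2=\Big(\frac{a^2}{n}+\frac{b^2}{m}\Big)\lambda+(a+b-1)^2\lambda^2.$$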
Let $X_1,\dots,X_n$ be i.i.d. [Recall that i.i.d. stands for independent and identically distributed.] Since $X_1,\dots,X_n$ all have the same distribution, they have the same expected value and variance. Let $E(X_1)=\mu$ and $\operatorname{Var}(X_1)=\sigma^2$. Find the following in terms of $\mu$ and $\sigma^2$. (a) $E(X_1^2)$. Note this is not $\mu^2$! (b) $E\big(\sum_{i=1}^n X_i^2/n\big)$. (c) Now, define $W$ by $W = 1\dots$
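For reference, (a) and (b) follow from the variance identity $\sigma^2=E(X_1^2)-\mu^2$ (a standard fact):
$$E(X_1^2)=\sigma^2+\mu^2,\qquad E\Big(\frac{1}{n}\sum_{i=1}^n X_i^2\Big)=\frac{1}{n}\sum_{i=1}^n E(X_i^2)=\sigma^2+\mu^2.$$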
2. Let $X_1,\dots,X_n$ be i.i.d. according to a normal distribution $N(\mu,\sigma^2)$. (a) Get a sufficient statistic for $\mu$. Show your work. (b) Find the maximum likelihood estimator for $\mu$. (c) Show that the MLE in part (b) is an unbiased estimator for $\mu$. (d) Using Basu's theorem, prove that your MLE from before and $S^2$, the sample variance, are independent. (Hint: use $W_i=\frac{X_i-\mu}{\sigma}$ and $(n-1)S^2$.)
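A sketch of the standard Basu argument for (d), for reference: fix $\sigma^2$ and treat $\mu$ as the parameter. Then $\bar X$ is a complete sufficient statistic for $\mu$. Writing $W_i=(X_i-\mu)/\sigma$,
$$\frac{(n-1)S^2}{\sigma^2}=\sum_{i=1}^n (W_i-\bar W)^2\sim\chi^2_{n-1},$$
a distribution free of $\mu$, so $S^2$ is ancillary for $\mu$. Basu's theorem then gives the independence of $\bar X$ and $S^2$.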
(Exercise 6.4-2.) Let $X_1,X_2,\dots,X_n$ be i.i.d. $\sim N(\mu,\sigma^2)$. Assume $\mu$ is known; show that $\hat\theta=\frac{1}{n}\sum_{i=1}^n (X_i-\mu)^2$ is the MLE for $\sigma^2$ and show that it is unbiased.
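For reference, the unbiasedness part in one line (a standard fact): since $E\big[(X_i-\mu)^2\big]=\operatorname{Var}(X_i)=\sigma^2$ when $\mu$ is known,
$$E[\hat\theta]=\frac{1}{n}\sum_{i=1}^n E\big[(X_i-\mu)^2\big]=\sigma^2.$$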
Let $X_1,X_2,\dots,X_n$ be a random sample of size $n$ from a normal population with mean $\mu_X$ and variance $\sigma_X^2$. Let $Y_1,Y_2,\dots,Y_m$ be a random sample of size $m$ from a normal population with mean $\mu_Y$ and variance $\sigma_Y^2$. Also, assume that these two random samples are independent. It is desired to test the following hypotheses: $H_0:\sigma_X=\sigma_Y$ versus $H_1:\sigma_X\dots$
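The usual statistic for this two-sample variance test, sketched for reference (assuming the alternative is two-sided, $\sigma_X\neq\sigma_Y$, which the truncated statement suggests):
$$F=\frac{S_X^2}{S_Y^2}\sim F_{n-1,\,m-1}\quad\text{under } H_0,$$
where $S_X^2$ and $S_Y^2$ are the two sample variances; $H_0$ is rejected when $F$ falls outside the central $1-\alpha$ region of this $F$ distribution.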
Suppose that $X_1,X_2,\dots,X_n$ and $Y_1,Y_2,\dots,Y_n$ are independent random samples from populations with the same mean $\mu$ and variances $\sigma_1^2$ and $\sigma_2^2$, respectively. That is, $X_i\sim N(\mu,\sigma_1^2)$ and $Y_i\sim N(\mu,\sigma_2^2)$. Show that
$$\hat\mu=\frac{2\bar X+3\bar Y}{5}$$
is a consistent estimator of $\mu$.
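A sketch of the standard argument, for reference: $E[\hat\mu]=\frac{2\mu+3\mu}{5}=\mu$, so $\hat\mu$ is unbiased, and
$$\operatorname{Var}(\hat\mu)=\frac{4\operatorname{Var}(\bar X)+9\operatorname{Var}(\bar Y)}{25}=\frac{4\sigma_1^2+9\sigma_2^2}{25n}\xrightarrow[n\to\infty]{}0,$$
so by Chebyshev's inequality $\hat\mu\xrightarrow{P}\mu$, i.e., $\hat\mu$ is consistent.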