Let X1, ..., Xm be iid N(μ1, σ2) and Y1, ..., Yn be iid N(μ2, σ2), where the X's and Y's are independent. Here -∞ < μ1, μ2 < ∞ and 0 < σ < ∞ are unknown. Derive the MLE of (μ1, μ2, σ2). Is the MLE sufficient for (μ1, μ2, σ2)? Also derive the MLE of (μ1 - μ2)/σ.
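A hedged sketch of the standard derivation (not part of the original problem statement): maximizing the joint normal log-likelihood over both samples gives

```latex
% Joint log-likelihood of the two independent normal samples:
\ell(\mu_1,\mu_2,\sigma^2)
  = -\frac{m+n}{2}\log(2\pi\sigma^2)
    -\frac{1}{2\sigma^2}\Big[\sum_{i=1}^{m}(X_i-\mu_1)^2+\sum_{j=1}^{n}(Y_j-\mu_2)^2\Big].
% Setting the partial derivatives to zero yields
\hat\mu_1=\bar X,\qquad \hat\mu_2=\bar Y,\qquad
\hat\sigma^2=\frac{\sum_{i=1}^{m}(X_i-\bar X)^2+\sum_{j=1}^{n}(Y_j-\bar Y)^2}{m+n}.
% By the invariance property of MLEs,
\widehat{\left(\frac{\mu_1-\mu_2}{\sigma}\right)}=\frac{\bar X-\bar Y}{\hat\sigma}.
```

For the sufficiency part, the factorization theorem applied to the same likelihood shows that (X̄, Ȳ, σ̂2) is sufficient for (μ1, μ2, σ2), and the MLE is a one-to-one function of this statistic.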
Question 2: Let X1, ..., Xm be iid N(μ1, σ2) and Y1, ..., Yn be iid N(μ2, σ2), where the X's and Y's are independent. Let Sp2 = [Σ(Xi - X̄)2 + Σ(Yj - Ȳ)2]/(m + n - 2) be the pooled variance. Show that (X̄ - Ȳ - (μ1 - μ2)) / (Sp √(1/n + 1/m)) has a t distribution with (n + m - 2) degrees of freedom.
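A quick Monte Carlo sanity check of the claimed t(n+m-2) distribution (a sketch, not part of the problem; the sample sizes m = 5, n = 7, the parameter values, and the seed are arbitrary choices). The simulated statistic should have mean ≈ 0 and variance ≈ df/(df - 2):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, reps = 5, 7, 20000
mu, sigma = 3.0, 2.0           # common mean, so mu1 - mu2 = 0

x = rng.normal(mu, sigma, size=(reps, m))
y = rng.normal(mu, sigma, size=(reps, n))
xbar, ybar = x.mean(axis=1), y.mean(axis=1)

# Pooled variance with m + n - 2 degrees of freedom
sp2 = (((x - xbar[:, None]) ** 2).sum(axis=1)
       + ((y - ybar[:, None]) ** 2).sum(axis=1)) / (m + n - 2)

t = (xbar - ybar) / np.sqrt(sp2 * (1 / m + 1 / n))

df = m + n - 2
print(t.mean())                 # should be ~ 0
print(t.var(), df / (df - 2))   # both ~ 1.25 for df = 10
```

The variance check uses the fact that a t(df) variable has variance df/(df - 2) for df > 2.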
Problem 3. Consider two independent samples, X1, ..., Xm from a N(μ1, σ1²) distribution and Y1, ..., Yn from a N(μ2, σ2²) distribution. Here μ1, μ2, σ1² and σ2² are unknown. Consider testing the null hypothesis that the two population variances are equal, H0: σ1² = σ2², against the alternative that these variances are different, H1: σ1² ≠ σ2². (a) Derive the LR test statistic Λ.
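A hedged sketch of where part (a) lands (the standard result, assuming the usual per-sample MLEs):

```latex
% Unrestricted MLEs:
\hat\sigma_1^2=\tfrac{1}{m}\sum_i(X_i-\bar X)^2,\qquad
\hat\sigma_2^2=\tfrac{1}{n}\sum_j(Y_j-\bar Y)^2.
% Under H_0 the common-variance MLE is the pooled
\hat\sigma_0^2=\frac{m\hat\sigma_1^2+n\hat\sigma_2^2}{m+n},
% and the likelihood ratio reduces to
\Lambda=\frac{(\hat\sigma_1^2)^{m/2}\,(\hat\sigma_2^2)^{n/2}}{(\hat\sigma_0^2)^{(m+n)/2}},
% a monotone function of the F-type ratio \hat\sigma_1^2/\hat\sigma_2^2.
```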
Suppose Y1, Y2, ..., Yn are mutually independent random variables with Yi ~ N(μi, σi²) for i = 1, ..., n. Find the distribution of U = Σ_{i=1}^n ((Yi - μi)/σi)². I am not sure where I should start this question; could you please show me in detail how to do it? Thanks :)
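Since each (Yi - μi)/σi is standard normal, U is a sum of n independent squared standard normals, i.e. U ~ χ²(n). A small simulation check (a sketch; the particular μi, σi, and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([0.0, 1.5, -2.0, 4.0])      # arbitrary means
sigma = np.array([1.0, 0.5, 2.0, 3.0])    # arbitrary standard deviations
n, reps = len(mu), 200000

# Each row is one draw of (Y1, ..., Yn); loc/scale broadcast across columns
y = rng.normal(mu, sigma, size=(reps, n))
u = (((y - mu) / sigma) ** 2).sum(axis=1)

# A chi-square with n df has mean n and variance 2n
print(u.mean())  # should be ~ 4
print(u.var())   # should be ~ 8
```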
Please help with question 4. Consider the simple linear regression model yi = α + βxi + εi with εi ~ N(0, σ2), where σ2 is known. Assume the x's are fixed and known, and only the y's are random. Recall Ex 3.5.22 in Homework 1. Here the design matrix has a column of 1's and a column of the xi's, and the regression coefficient is β = (α, β)^T. 3. Derive the MLE of α and β and show that it is independent of σ2. Is your MLE the same as the least squares estimate in Ex 3.5.22? 4. Derive the mean...
X1, ..., Xm and Y1, ..., Yn are independent with common mean ξ and variances σ2 and τ2, respectively. Use (2.2.1) to show that the grand mean (X1 + ... + Xm + Y1 + ... + Yn)/(m + n) is a consistent estimator of ξ, provided m + n → ∞.
Consider a random sample (X1, Y1), (X2, Y2), ..., (Xn, Yn) where Y | X = x is modeled by a N(β0 + β1x, σ2) distribution, where β0, β1 and σ2 are unknown. (a) Prove that the MLE of β1 is an unbiased estimator of β1. (b) Prove that the MLE of β0 is an unbiased estimator of β0.
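A hedged sketch of the usual unbiasedness argument (conditioning on the observed x's; the MLE here coincides with least squares):

```latex
% The MLE of \beta_1:
\hat\beta_1=\frac{\sum_i(x_i-\bar x)(Y_i-\bar Y)}{\sum_i(x_i-\bar x)^2}
           =\frac{\sum_i(x_i-\bar x)\,Y_i}{\sum_i(x_i-\bar x)^2},
% using \sum_i(x_i-\bar x)=0. With E[Y_i\mid x]=\beta_0+\beta_1 x_i:
E[\hat\beta_1\mid x]
 =\frac{\sum_i(x_i-\bar x)(\beta_0+\beta_1 x_i)}{\sum_i(x_i-\bar x)^2}=\beta_1.
% Then \hat\beta_0=\bar Y-\hat\beta_1\bar x gives
E[\hat\beta_0\mid x]=\beta_0+\beta_1\bar x-\beta_1\bar x=\beta_0.
```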
Let X1, ..., Xn be i.i.d. from N(μ1, σ2), and Y1, ..., Ym be i.i.d. from N(μ2, σ2). If the two samples are independent, find the maximum likelihood estimates for μ1, μ2, and the common variance σ2.
Let X1,...,Xn be iid N(μ,σ2) with known μ and unknown σ. For α in (0,1), obtain the UMP level α test for H0: σ=σ0 vs. H1: σ>σ0
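A hedged sketch of the standard route via the Karlin–Rubin theorem (the family has monotone likelihood ratio in the statistic below):

```latex
% With \mu known, the joint density has MLR in T=\sum_{i=1}^{n}(X_i-\mu)^2,
% and T/\sigma_0^2 \sim \chi^2_n under H_0. The UMP level-\alpha test is
\text{reject } H_0 \iff \frac{\sum_{i=1}^{n}(X_i-\mu)^2}{\sigma_0^2} > \chi^2_{n,\alpha},
% where \chi^2_{n,\alpha} denotes the upper-\alpha quantile of \chi^2_n.
```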
Suppose that X1, X2, ..., Xn are iid N(μ, σ2), where both parameters are unknown. Derive the likelihood ratio test (LRT) of H0: σ2 ≤ σ0² versus H1: σ2 > σ0². (a) Argue that the LRT rejects H0 when (n - 1)S2/σ0² is large, and find the critical value that gives a size α test. (b) Derive the power function of the LRT.
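A hedged sketch of the standard answer (writing σ0² for the null boundary value):

```latex
% The LRT rejects for large W=(n-1)S^2/\sigma_0^2, with size-\alpha cutoff
\text{reject } H_0 \iff \frac{(n-1)S^2}{\sigma_0^2} > \chi^2_{n-1,\alpha}.
% Since (n-1)S^2/\sigma^2 \sim \chi^2_{n-1}, the power function is
\beta(\sigma^2)=P\!\left(\chi^2_{n-1} > \frac{\sigma_0^2}{\sigma^2}\,\chi^2_{n-1,\alpha}\right),
% which is increasing in \sigma^2, so the size is attained at \sigma^2=\sigma_0^2.
```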
Suppose X1, X2, ..., Xm are iid exponential with mean β1, and Y1, Y2, ..., Yn are iid exponential with mean β2. Suppose the samples are independent. (a) Derive the likelihood ratio test (LRT) statistic λ(x, y) for testing H0: β1 = β2 versus H1: β1 ≠ β2, and show that it is a function of t1 = t1(x) = Σ_{i=1}^m xi and t2 = t2(y) = Σ_{j=1}^n yj. (b) Show how you could perform a size α test in part (a) using the F distribution.
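A hedged sketch of part (b), using standard exponential/gamma facts:

```latex
% t_1=\sum_i X_i \sim \mathrm{Gamma}(m,\beta_1), so 2t_1/\beta_1 \sim \chi^2_{2m};
% similarly 2t_2/\beta_2 \sim \chi^2_{2n}. Under H_0:\beta_1=\beta_2,
F=\frac{t_1/m}{t_2/n} \sim F_{2m,\,2n},
% so a size-\alpha test rejects when F<F_{2m,2n,1-\alpha/2} or F>F_{2m,2n,\alpha/2}.
```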