1. (40) Suppose that X₁, X₂, …, Xₙ forms an independent and identically distributed sample from a normal distribution with mean μ and variance σ², both unknown:

f(x; μ, σ²) = (1/√(2πσ²)) exp(−(x − μ)²/(2σ²)),  −∞ < x < ∞,  −∞ < μ < ∞,  σ² > 0.

(a) Derive the sample variance, S², for this random sample.
(b) Derive the maximum likelihood estimators (MLEs) of μ and σ², denoted μ̂ and σ̂², respectively.
(c) Find the MLE of μ².
(d) Derive the method of moments estimators of μ and σ².
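As a hedged numerical aside on parts (a)–(b): the sample variance S² uses the 1/(n − 1) factor and is unbiased, while the MLE σ̂² uses 1/n and is biased by the factor (n − 1)/n. A minimal Monte Carlo sketch (the values μ = 2, σ² = 4, n = 10 are arbitrary illustrative choices, not from the problem):

```python
import numpy as np

# Monte Carlo sanity check: S^2 (divide by n-1) is unbiased for sigma^2,
# while the MLE (divide by n) is biased by (n-1)/n.
rng = np.random.default_rng(0)
mu, sigma2, n, reps = 2.0, 4.0, 10, 20000

x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

s2 = ss / (n - 1)        # sample variance S^2
s2_mle = ss / n          # MLE of sigma^2

print(s2.mean())         # close to sigma^2 = 4.0
print(s2_mle.mean())     # close to (n-1)/n * 4 = 3.6
```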
Let X₁, …, Xₙ be independent and identically distributed random variables with unknown mean μ and unknown variance σ². It is given that the sample variance S² is an unbiased estimator of σ². Suggest why the estimator X̄² − S²/n might be proposed for estimating μ², and justify your answer.
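The motivation is that E[X̄²] = μ² + σ²/n, so subtracting S²/n removes the bias. A small simulation sketch (μ = 2, σ² = 4, n = 10 are arbitrary illustrative values):

```python
import numpy as np

# Check that Xbar^2 - S^2/n has expectation mu^2, while Xbar^2 alone
# overshoots by sigma^2/n.
rng = np.random.default_rng(1)
mu, sigma2, n, reps = 2.0, 4.0, 10, 50000

x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)

est = xbar**2 - s2 / n
print(est.mean())        # close to mu^2 = 4.0
print((xbar**2).mean())  # close to mu^2 + sigma^2/n = 4.4
```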
Suppose that X₁, X₂, …, Xₙ are iid N(μ, σ²), where both parameters are unknown. Derive the likelihood ratio test (LRT) of H₀: σ² ≤ σ₀² versus H₁: σ² > σ₀².
(a) Argue that the LRT will reject H₀ when (n − 1)S²/σ₀² is large, and find the critical value that gives a size α test.
(b) Derive the power function of the LRT.
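Under H₀ the statistic (n − 1)S²/σ₀² is χ² with n − 1 degrees of freedom, so the size-α critical value is the upper-α chi-square quantile. A simulation sketch (σ₀² = 1, n = 15, α = 0.05 are illustrative choices; the quantile is estimated by simulation rather than read from a chi-square table):

```python
import numpy as np

# Estimate the critical value under the null, then check the test's size
# at sigma^2 = sigma0^2 and its power at sigma^2 = 2 * sigma0^2.
rng = np.random.default_rng(2)
n, sigma0_sq, alpha, reps = 15, 1.0, 0.05, 40000

def lrt_stat(sigma_sq, reps):
    x = rng.normal(0.0, np.sqrt(sigma_sq), size=(reps, n))
    return (n - 1) * x.var(axis=1, ddof=1) / sigma0_sq

crit = np.quantile(lrt_stat(sigma0_sq, reps), 1 - alpha)

size = (lrt_stat(sigma0_sq, reps) >= crit).mean()       # close to alpha
power = (lrt_stat(2 * sigma0_sq, reps) >= crit).mean()  # well above alpha
print(crit, size, power)
```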
Suppose that X₁, …, Xₙ is a random sample from a normal distribution with mean μ and variance σ². Two unbiased estimators of σ² are

σ̂₁² = S² = (1/(n − 1)) Σᵢ₌₁ⁿ (Xᵢ − X̄)²  and  σ̂₂² = (X₁ − X₂)²/2.

Find the efficiency of σ̂₁² relative to σ̂₂².
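Analytically Var(S²) = 2σ⁴/(n − 1) while Var((X₁ − X₂)²/2) = 2σ⁴, so the relative efficiency should come out as n − 1. A Monte Carlo sketch (σ² = 1, n = 5 are arbitrary illustrative values):

```python
import numpy as np

# Compare the variances of the two unbiased estimators of sigma^2;
# their ratio estimates the relative efficiency, expected near n - 1.
rng = np.random.default_rng(3)
sigma2, n, reps = 1.0, 5, 100000

x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
est1 = x.var(axis=1, ddof=1)          # sigma1_hat^2 = S^2
est2 = (x[:, 0] - x[:, 1]) ** 2 / 2   # sigma2_hat^2

eff = est2.var() / est1.var()
print(eff)    # close to n - 1 = 4
```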
2. A random sample X₁, X₂, …, Xₙ of size n is taken from N(μ, σ²), where the mean μ is a known real number. Show that the maximum likelihood estimator for σ² is σ̂²_MLE = (1/n) Σᵢ₌₁ⁿ (Xᵢ − μ)², and that this estimator is an unbiased estimator of σ². (Hint: Σᵢ₌₁ⁿ (Xᵢ − μ)²/σ² follows χ²(n).)
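Each term (Xᵢ − μ)² has expectation σ² when μ is known, so dividing the sum by n already gives an unbiased estimator, unlike the unknown-mean case where centring at X̄ loses a degree of freedom. A quick check (μ = 1, σ² = 4, n = 5 are illustrative values):

```python
import numpy as np

# With mu known, the MLE (1/n) * sum (X_i - mu)^2 should average sigma^2.
rng = np.random.default_rng(4)
mu, sigma2, n, reps = 1.0, 4.0, 5, 50000

x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
mle_known_mu = ((x - mu) ** 2).mean(axis=1)
print(mle_known_mu.mean())   # close to sigma^2 = 4
```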
Suppose you have a sample of n independent observations X₁, X₂, …, Xₙ from a normal population with mean μ (known) and variance σ² (unknown).
(a) Find the ML estimator of σ².
(b) Show that the ML estimator in (a) is a consistent estimator of σ².
(c) Find a sufficient statistic for σ².
(d) Give an MVUE for σ² based on the sufficient statistic.
Problem 7. Let X₁, X₂, …, Xₙ be i.i.d. (independent and identically distributed) random variables with unknown mean μ and variance σ². In order to estimate μ and σ² from the data we consider the following estimates:

μ̂ = X̄ = (1/n) Σᵢ₌₁ⁿ Xᵢ  and  S² = (1/(n − 1)) Σᵢ₌₁ⁿ (Xᵢ − X̄)².

Show that both of these estimates are unbiased. That is, show that E(μ̂) = μ and E(S²) = σ².
4. Let X₁, X₂, …, Xₙ be a random sample from a N(μ, σ²) distribution, and let s² = (1/n) Σᵢ₌₁ⁿ (Xᵢ − X̄)² and S² = (1/(n − 1)) Σᵢ₌₁ⁿ (Xᵢ − X̄)² be two estimators of σ².
(i) Show that the MSE of s² is smaller than the MSE of S².
(ii) Find E[√S²] and suggest an unbiased estimator of σ.
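For part (i), MSE(s²) = (2n − 1)σ⁴/n² while MSE(S²) = 2σ⁴/(n − 1); for part (ii), E[S] = σ·√(2/(n − 1))·Γ(n/2)/Γ((n − 1)/2), so dividing S by that constant gives an unbiased estimator of σ. A simulation sketch (σ = 2, n = 10 are illustrative values):

```python
import numpy as np
from math import gamma, sqrt

# Compare the MSEs of the 1/n and 1/(n-1) variance estimators, and check
# the bias-correcting constant for estimating sigma itself.
rng = np.random.default_rng(5)
sigma, n, reps = 2.0, 10, 50000

x = rng.normal(0.0, sigma, size=(reps, n))
S2 = x.var(axis=1, ddof=1)                 # divides by n - 1
s2 = x.var(axis=1, ddof=0)                 # divides by n

mse_S2 = ((S2 - sigma**2) ** 2).mean()
mse_s2 = ((s2 - sigma**2) ** 2).mean()

c = sqrt(2.0 / (n - 1)) * gamma(n / 2) / gamma((n - 1) / 2)
print(mse_s2 < mse_S2)            # expected True
print((np.sqrt(S2) / c).mean())   # close to sigma = 2
```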
3. Suppose X₁, X₂, … are independent, identically distributed random variables with mean μ and variance σ². Let S₀ = 0 and for n > 0 let Sₙ = X₁ + ⋯ + Xₙ denote the partial sum. Let Fₙ denote the information contained in X₁, …, Xₙ.
(1) Verify that Sₙ − nμ is a martingale.
(2) Assume that μ = 0; verify that Sₙ² − nσ² is a martingale.
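The key step for (2) is E[S²ₙ₊₁ | Fₙ] = Sₙ² + σ² when μ = 0. A numerical illustration (σ² = 1, n = 20 chosen arbitrarily): condition on one realised prefix, then average the next-step martingale value over many draws of Xₙ₊₁.

```python
import numpy as np

# Conditional on a fixed prefix with partial sum S_n, the mean of
# S_{n+1}^2 - (n+1)*sigma^2 over fresh draws of X_{n+1} should return
# the current value S_n^2 - n*sigma^2 (martingale property, mu = 0).
rng = np.random.default_rng(6)
sigma2, n, reps = 1.0, 20, 200000

Sn = rng.normal(0.0, np.sqrt(sigma2), size=n).sum()   # one fixed prefix
Mn = Sn**2 - n * sigma2                               # current value M_n

S_next = Sn + rng.normal(0.0, np.sqrt(sigma2), size=reps)
M_next = S_next**2 - (n + 1) * sigma2

print(M_next.mean(), Mn)   # the two should nearly agree
```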
Question 6. Let X₁, …, Xₙ denote a sequence of independent and identically distributed (i.i.d.) N(μ, σ²) random variables, and let Y₁, …, Y_m denote an independent sequence of i.i.d. N(μ, σ²) random variables.
1. Show that λX̄ + (1 − λ)Ȳ is an unbiased estimator of μ for any value of λ in the unit interval, i.e. 0 ≤ λ ≤ 1.
2. Verify that the variance of this estimator is minimised when λ = n/(n + m), and determine the minimum variance.
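By independence, Var(λX̄ + (1 − λ)Ȳ) = λ²σ²/n + (1 − λ)²σ²/m, which is a quadratic in λ minimised at λ = n/(n + m) with minimum value σ²/(n + m). A deterministic grid check (σ² = 1, n = 5, m = 20 are illustrative choices):

```python
import numpy as np

# Evaluate the variance formula on a grid of lambda values and locate
# its minimiser, expected at n/(n+m) with value sigma^2/(n+m).
sigma2, n, m = 1.0, 5, 20

lams = np.linspace(0.0, 1.0, 1001)
var = lams**2 * sigma2 / n + (1 - lams) ** 2 * sigma2 / m

lam_star = lams[np.argmin(var)]
print(lam_star)   # close to n/(n+m) = 0.2
print(var.min())  # close to sigma^2/(n+m) = 0.04
```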