4. Let X_1, ..., X_n be independent and suppose that E(X_i) = k_i θ + b_i, for known ...
Advanced Statistics — I need help with (c) and (d).
2. Let X_1, X_2, ..., X_n be a random sample from a Bernoulli(θ) distribution with probability function f(x; θ) = θ^x (1 − θ)^(1−x), x ∈ {0, 1}. Note that, for a random variable X with a Bernoulli(θ) distribution, E[X] = θ and Var[X] = θ(1 − θ).
(a) Obtain the log-likelihood function ℓ(θ), and hence show that the maximum likelihood estimator of θ is θ̂ = (1/n) Σ_{i=1}^n X_i.
(b) Show that dℓ(θ)/dθ = ...
(c) Calculate the expected information I(θ) = −E[d²ℓ(θ)/dθ²].
(d) Show ...
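A quick Monte Carlo check of parts (a) and (c) — a sketch under assumed illustrative values (θ = 0.3, n = 200, 20000 replications, none of which are in the question): the MLE X̄ should be approximately unbiased, and its variance should be close to 1/I(θ) = θ(1 − θ)/n.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.3, 200, 20000          # illustrative values, not from the question

samples = rng.binomial(1, theta, size=(reps, n))
mle = samples.mean(axis=1)                # theta_hat = (1/n) * sum(X_i)

crlb = theta * (1 - theta) / n            # 1 / I(theta), the Cramer-Rao lower bound
print(mle.mean())                         # close to theta: approximately unbiased
print(mle.var(), crlb)                    # simulated variance close to 1/I(theta)
```

The agreement of the simulated variance with 1/I(θ) reflects the fact that X̄ attains the Cramér–Rao lower bound here.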
2. Let X_1, X_2, ..., X_n denote independent and identically distributed random variables with variance σ^2. Which of the following is sufficient to conclude that the estimator T = f(X_1, ..., X_n) of a parameter θ is consistent (fully justify your answer)?
(a) Var(T) → 0 as n → ∞.
(b) E(T) = ((n − 1)/n) θ and Var(T) → 0 as n → ∞.
(c) E(T) = θ.
(d) E(T) = θ and Var(T) = σ^2.
QUESTION 2
Let X_1, ..., X_n be a random sample from a N(μ, σ^2) distribution, and let S^2 = (1/(n − 1)) Σ_{i=1}^n (X_i − X̄)^2 and S̃^2 = ((n − 1)/n) S^2 be two estimators of σ^2. Given: E(S^2) = σ^2 and V(S^2) = 2σ^4/(n − 1).
(a) Determine: (i) E(S̃^2); (ii) V(S̃^2); and (iii) MSE(S̃^2).
(b) Which of S^2 and S̃^2 has the larger mean square error?
(c) Suppose that θ̂_n is an estimator of θ based on a random sample of size n. Another equivalent definition of ...
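For parts (a) and (b), the closed-form answers work out to MSE(S^2) = 2σ^4/(n − 1) and MSE(S̃^2) = (2n − 1)σ^4/n^2. A simulation sketch under assumed values (μ = 0, σ = 2, n = 10 — none specified in the question) that checks both formulas:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 0.0, 2.0, 10, 200000   # illustrative values
x = rng.normal(mu, sigma, size=(reps, n))

s2 = x.var(axis=1, ddof=1)            # S^2: divisor n-1, unbiased
s2_tilde = x.var(axis=1, ddof=0)      # S~^2 = ((n-1)/n) S^2: divisor n, biased

def mse(est):
    return np.mean((est - sigma**2) ** 2)

mse_s2, mse_tilde = mse(s2), mse(s2_tilde)
print(mse_s2, 2 * sigma**4 / (n - 1))              # simulated vs analytic MSE(S^2)
print(mse_tilde, (2 * n - 1) * sigma**4 / n**2)    # simulated vs analytic MSE(S~^2)
```

Despite its bias, S̃^2 has the smaller MSE, since 2n^2 > (2n − 1)(n − 1) for all n ≥ 1.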
1.(c) 2.(a),(b)
5. Let X_1, ..., X_n be iid N(θ, 1).
(a) Show that X̄ is a complete sufficient statistic.
(b) Show that the UMVUE of θ^2 is X̄^2 − 1/n.
6. Let X_1, ..., X_n be i.i.d. gamma(α, θ), where α > 1 is known, with density f(x) = x^(α−1) e^(−x/θ) / (Γ(α) θ^α), x > 0, θ > 0.
(a) Show that Σ X_i is complete and sufficient for θ.
(b) Find E[1/X].
(c) Find the UMVUE of 1/θ.
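For problem 5(b), a simulation sketch (θ = 1.5, n = 25 are assumed illustrative values) showing that subtracting Var(X̄) = 1/n exactly removes the bias of X̄^2 as an estimator of θ^2:

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 1.5, 25, 200000       # illustrative values
x = rng.normal(theta, 1.0, size=(reps, n))
xbar = x.mean(axis=1)

umvue = xbar**2 - 1.0 / n              # claimed UMVUE of theta^2
naive = xbar**2                        # biased upward by Var(Xbar) = 1/n

print(umvue.mean())                    # close to theta^2 = 2.25
print(naive.mean())                    # close to theta^2 + 1/n
```

Unbiasedness plus being a function of the complete sufficient statistic X̄ is what makes X̄^2 − 1/n the UMVUE (Lehmann–Scheffé).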
Please answer with full solution and explanation. (4 points) Let X_1, ..., X_n denote a random sample from a normal N(μ, 1) distribution, with μ as the unknown parameter. Let X̄ denote the sample mean. (Note that the mean and the variance of a normal N(μ, σ^2) distribution are μ and σ^2, respectively.) Is X̄^2 an unbiased estimator for μ^2? Explain your answer. (Hint: Recall the formula E(X^2) = (E(X))^2 + Var(X) and apply this formula for X̄ — be careful on the ...
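Carrying out the hint gives a one-line worked calculation (using Var(X̄) = 1/n for the N(μ, 1) model):

```latex
E(\bar{X}^2) = \bigl(E(\bar{X})\bigr)^2 + \operatorname{Var}(\bar{X}) = \mu^2 + \frac{1}{n} \neq \mu^2
```

So X̄^2 overestimates μ^2 by exactly 1/n and is therefore biased, though asymptotically unbiased since 1/n → 0 as n → ∞.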
Let X be a random variable with cdf F_X(x; θ), expected value E[X] = μ, and variance V[X] = σ^2. Let X_1, X_2, ..., X_n be an iid sample drawn according to F_X(x; θ), where F_X(x; θ) = x/θ for all x ∈ (0, θ). Let θ̂ = max(X_1, X_2, ..., X_n) be an estimator of θ, suggested by pure common sense. Remember that if Y = max(X_1, X_2, ..., X_n), then it can be shown that the cdf F_Y(y) of Y is given by F_Y(y) = (F_X(y))^n, where ...
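From F_Y(y) = (y/θ)^n one obtains E[Y] = nθ/(n + 1), so the sample maximum slightly underestimates θ. A simulation sketch under assumed values (θ = 5, n = 8, not given in the question):

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 5.0, 8, 200000        # illustrative values
x = rng.uniform(0, theta, size=(reps, n))
y = x.max(axis=1)                      # theta_hat = max(X_1, ..., X_n)

print(y.mean())                        # close to n*theta/(n+1), below theta
corrected = (n + 1) / n * y            # bias-corrected version of the maximum
print(corrected.mean())                # close to theta
```

The multiplicative correction (n + 1)/n turns the common-sense estimator into an unbiased one.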
7. Show that E(X̄) = θ and Var(X̄) = σ^2/n if X_1, ..., X_n are independent and identically distributed with E(X_i) = θ and Var(X_i) = σ^2 for i = 1, ..., n.
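The required argument is a two-step calculation with linearity of expectation and independence:

```latex
E(\bar{X}) = \frac{1}{n}\sum_{i=1}^{n} E(X_i) = \theta, \qquad
\operatorname{Var}(\bar{X}) = \frac{1}{n^2}\sum_{i=1}^{n} \operatorname{Var}(X_i) = \frac{\sigma^2}{n}
```

Only the variance step uses independence (so that the covariance cross-terms vanish); the expectation step holds regardless.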
Suppose X_1, X_2, ..., X_n is an iid sample from f_X(x | θ) = θ e^(−θx) 1(x > 0), where θ > 0.
For n ≥ 2, show that (n − 1)/Σ_{i=1}^n X_i is the uniformly minimum variance unbiased estimator (UMVUE) of θ.
(d) For this part only, suppose that n = 1. If T(X_1) is an unbiased estimator of θ, show that P_θ(T(X_1) < 0) > 0.
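Since Σ X_i has a Gamma(n, rate θ) distribution, E[1/Σ X_i] = θ/(n − 1) for n ≥ 2, which is why the (n − 1) numerator makes the estimator unbiased while the MLE n/Σ X_i is not. A simulation sketch under assumed values (θ = 2, n = 6):

```python
import numpy as np

rng = np.random.default_rng(4)
theta, n, reps = 2.0, 6, 300000             # illustrative values
# numpy's exponential takes a scale parameter, so rate theta -> scale 1/theta
x = rng.exponential(1.0 / theta, size=(reps, n))

umvue = (n - 1) / x.sum(axis=1)             # unbiased for theta
mle = n / x.sum(axis=1)                     # biased upward by factor n/(n-1)

print(umvue.mean())                         # close to theta
print(mle.mean())                           # close to n*theta/(n-1)
```
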
4. Let X_1, X_2, ..., X_n be a random sample from a N(μ, σ^2) distribution, and let S*^2 = (1/n) Σ_{i=1}^n (X_i − X̄)^2 and S^2 = (1/(n − 1)) Σ_{i=1}^n (X_i − X̄)^2 be estimators of σ^2.
(i) Show that the MSE of S*^2 is smaller than the MSE of S^2.
(ii) Find E[√(S^2)] and suggest an unbiased estimator of σ.
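For part (ii), the standard result is E[S] = c_4 σ with c_4 = √(2/(n − 1)) Γ(n/2)/Γ((n − 1)/2), so S/c_4 is unbiased for σ. A simulation sketch under assumed values (σ = 3, n = 10) checking the constant:

```python
import numpy as np
from math import gamma, sqrt

rng = np.random.default_rng(5)
sigma, n, reps = 3.0, 10, 200000            # illustrative values
x = rng.normal(0.0, sigma, size=(reps, n))
s = x.std(axis=1, ddof=1)                   # S = sqrt(S^2)

# E[S] = c4 * sigma, with c4 = sqrt(2/(n-1)) * Gamma(n/2) / Gamma((n-1)/2)
c4 = sqrt(2.0 / (n - 1)) * gamma(n / 2) / gamma((n - 1) / 2)
print(s.mean(), c4 * sigma)                 # close agreement: S underestimates sigma
print((s / c4).mean())                      # close to sigma: S/c4 is unbiased
```

The factor c_4 comes from S^2 being distributed as σ^2 χ^2_{n−1}/(n − 1), so E[S] is a Gamma-function moment of a chi distribution.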
5. Let X_1, X_2, ..., X_n be independently distributed as N(μ, σ^2). Define
S^2 = (1/(n − 1)) Σ_{i=1}^n (X_i − X̄)^2 and Q = (1/(2(n − 1))) Σ_{i=1}^{n−1} (X_{i+1} − X_i)^2.
(a) Prove that var[S^2] = 2σ^4/(n − 1).
(b) Show that Q is an unbiased estimate of σ^2.
(c) Find the variance of Q and hence show that, as n → ∞, the efficiency of Q relative to S^2 is ...
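Since X_{i+1} − X_i ~ N(0, 2σ^2), each squared difference has mean 2σ^2, which gives the unbiasedness in (b). A simulation sketch under assumed values (σ = 1, n = 50) that checks (b) and compares the two variances empirically for part (c):

```python
import numpy as np

rng = np.random.default_rng(6)
sigma, n, reps = 1.0, 50, 100000             # illustrative values
x = rng.normal(0.0, sigma, size=(reps, n))

s2 = x.var(axis=1, ddof=1)                   # usual unbiased sample variance
d = np.diff(x, axis=1)                       # X_{i+1} - X_i, i = 1, ..., n-1
q = (d ** 2).sum(axis=1) / (2 * (n - 1))     # Q, also unbiased for sigma^2

print(q.mean())                              # close to sigma^2
print(s2.var(), q.var())                     # Q has the larger variance
print(s2.var() / q.var())                    # empirical efficiency of Q vs S^2
```

The last ratio estimates the relative efficiency asked for in (c); the simulation shows Q paying a variance penalty for using only successive differences.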