The mean is given by:

Mean = (Σ_{i=1}^n Xi) / n,

so Σ_{i=1}^n Xi = mean × n. Substituting into the left-hand side of the equation above:

(mean × n) / (√n × log n) = (√n × mean) / log n.

Since the mean is 0 (given), this equals (√n × 0) / log n = 0.
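The algebra above can be verified symbolically — a minimal sketch using sympy, with illustrative symbol names:

```python
import sympy as sp

# Symbolic check that (mean * n) / (sqrt(n) * log(n)) simplifies to
# (sqrt(n) * mean) / log(n), and that the value is 0 when mean = 0.
n = sp.symbols('n', positive=True)
mean = sp.symbols('mean')

lhs = (mean * n) / (sp.sqrt(n) * sp.log(n))
rhs = (sp.sqrt(n) * mean) / sp.log(n)

assert sp.simplify(lhs - rhs) == 0   # the two forms are identical
assert lhs.subs(mean, 0) == 0        # the expression vanishes at mean = 0
```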
Suppose that X1, X2, ..., Xn is an iid sample from [...], where σ > 0. (a) Show that [...] is a complete and sufficient statistic for σ. (b) Prove that Y1 = |X1| follows an exponential distribution with mean σ. (c) Find the uniformly minimum variance unbiased estimator (UMVUE) of [...], where r is a fixed constant larger than 0.
Suppose X1, X2, ..., Xn is an iid N(μ, c²μ²) sample, where c² is known. Let μ̃ and μ̂ denote the method of moments and maximum likelihood estimators of μ, respectively. (a) Show that μ̃ = X̄ and μ̂ = [...], where m2 = n⁻¹ Σ_{i=1}^n Xi² is the second sample (uncentered) moment. (b) Prove that both μ̃ and μ̂ are consistent estimators of μ. (c) Show that √n(μ̃ − μ) → N(0, σ₁²) and √n(μ̂ − μ) → N(0, σ₂²). Calculate σ₁² and σ₂². Which estimator...
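Part (b) can be illustrated empirically. A minimal sketch, assuming arbitrary illustrative values for μ and c, showing that the method-of-moments estimator X̄ concentrates around μ as n grows:

```python
import numpy as np

# Monte Carlo illustration of consistency of mu_tilde = sample mean
# for an N(mu, c^2 * mu^2) sample. mu and c are arbitrary choices.
rng = np.random.default_rng(4)
mu, c = 2.0, 0.5

for n in (100, 10_000, 1_000_000):
    x = rng.normal(mu, c * mu, size=n)
    print(n, abs(x.mean() - mu))   # error shrinks roughly like 1/sqrt(n)
```

The printed errors decrease with n, consistent with the CLT rate in part (c).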
Suppose that X1, X2, ..., Xn is an iid sample from [...], for x ∈ R and σ > 0. (a) Derive a size α likelihood ratio test (LRT) of H0 : σ = 1 versus H1 : σ ≠ 1. (b) Derive the power function β(σ) of the LRT.
Suppose that X1, X2, ..., Xn is an iid sample from [...], where θ > 0 is unknown. (a) Find the uniformly minimum variance unbiased estimator (UMVUE) of [...]. (b) Find the uniformly most powerful (UMP) test of [...] versus [...], where θ0 is known. (c) Derive an expression for the power function of the test in part (b).
1. (Wasserman: Exercise 3.8.24) Let X1, X2, ..., Xn be IID Exponential(β). Find the moment generating function (MGF) of Xi. Prove that Σ_{i=1}^n Xi ~ Gamma(n, β).
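The Gamma result can be checked by simulation. A minimal sketch, with arbitrary illustrative values for n and β, comparing the empirical distribution of the sum against the Gamma(n, β) CDF via a Kolmogorov–Smirnov test:

```python
import numpy as np
from scipy import stats

# Empirical check: a sum of n iid Exponential(beta) variables (scale
# parameterization, mean beta) matches Gamma(shape=n, scale=beta).
# n, beta, and the number of replications are arbitrary choices.
rng = np.random.default_rng(0)
n, beta, reps = 5, 2.0, 10_000

# Each row holds one sample of n exponentials; sum across each row.
sums = rng.exponential(scale=beta, size=(reps, n)).sum(axis=1)

# KS distance between the simulated sums and the Gamma(n, beta) CDF.
ks_stat, p_value = stats.kstest(sums, stats.gamma(a=n, scale=beta).cdf)
print(ks_stat)   # small when the distributions agree
```

The sample mean of the sums should also be close to nβ, matching E[Gamma(n, β)] = nβ.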
3. Let X1, ..., Xn be iid random variables with mean μ and variance σ². Let X̄ denote the sample mean and V = Σ_{i=1}^n (Xi − X̄)². (a) Derive the expected values of X̄ and V. (b) Further suppose that X1, ..., Xn are normally distributed. Let A_{n×n} = ((aij)) be an orthogonal matrix whose first row is (1/√n, ..., 1/√n), and let Y = AX, where Y = (Y1, ..., Yn) and X = (X1, ..., Xn) are (column) vectors. (It is not necessary to know aij...
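The expected values in part (a) are E[X̄] = μ and E[V] = (n − 1)σ², and both can be checked by simulation. A minimal sketch, with arbitrary illustrative parameter values:

```python
import numpy as np

# Monte Carlo check of E[sample mean] = mu and E[V] = (n - 1) * sigma^2,
# where V = sum((Xi - Xbar)^2). Distribution and parameters are arbitrary.
rng = np.random.default_rng(1)
mu, sigma, n, reps = 3.0, 2.0, 8, 200_000

x = rng.normal(mu, sigma, size=(reps, n))
xbar = x.mean(axis=1)
v = ((x - xbar[:, None]) ** 2).sum(axis=1)

print(xbar.mean())   # close to mu = 3
print(v.mean())      # close to (n - 1) * sigma**2 = 28
```

Note that E[V] = (n − 1)σ², not nσ², which is why V/(n − 1) is the unbiased sample variance.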
Suppose X1, X2, ..., Xn is an iid sample from fX(x|θ) = θ e^(−θx) 1(x > 0), where θ > 0. ... For n ≥ 2, show that (n − 1)/Σ_{i=1}^n Xi is the uniformly minimum variance unbiased estimator (UMVUE) of θ. (d) For this part only, suppose that n = 1. If T(X1) is an unbiased estimator of θ, show that Pθ(T(X1) < 0) > 0.
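Assuming the estimator in question is the standard (n − 1)/Σ Xi for the exponential rate, its unbiasedness can be illustrated with a quick Monte Carlo sketch (θ and n are arbitrary illustrative choices):

```python
import numpy as np

# Monte Carlo check that (n - 1) / sum(Xi) is unbiased for the rate theta
# of an Exponential(theta) sample (numpy parameterizes by scale = 1/theta).
rng = np.random.default_rng(2)
theta, n, reps = 1.5, 10, 500_000

x = rng.exponential(scale=1.0 / theta, size=(reps, n))
est = (n - 1) / x.sum(axis=1)
print(est.mean())   # close to theta = 1.5
```

The (n − 1) factor corrects the bias of the naive MLE n/Σ Xi, whose expectation is nθ/(n − 1).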
Suppose that X1, X2, ..., Xn is an iid sample from [...], where the parameter θ satisfies 0 < θ < 1. (a) Estimate θ using the method of moments (MOM) and using the method of maximum likelihood. Note: I am not sure if you can get closed-form expressions for either estimator, but that is OK. Just write out the equation(s) that would need to be solved (numerically) to
Suppose that X = (X1, X2, ..., Xn) and Y = (Y1, Y2, ..., Ym) are random samples from continuous distributions F and G, respectively. Wilcoxon's two-sample test statistic W = W(X, Y) is defined to be Σ_{i=1}^n Ri, where Ri is the rank of Xi in the combined sample. [...], where U is the number of pairs (Xi, Yj) with Xi < Yj. In other words, U = Σ_{i=1}^n Σ_{j=1}^m 1_{ij}, where 1_{ij} = 1 if Xi < Yj and 0 otherwise. 3. Continuing from Question 2, show...
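The two statistics can be computed directly, and (with the convention U = #{(i, j) : Xi < Yj} and no ties) they satisfy W + U = n(n + 1)/2 + nm. A minimal sketch with arbitrary illustrative data:

```python
import numpy as np
from scipy import stats

# Compute W = sum of the ranks of the X-sample in the combined sample,
# and U = number of pairs (Xi, Yj) with Xi < Yj. Data are arbitrary;
# continuous draws make ties occur with probability zero.
rng = np.random.default_rng(3)
x = rng.normal(0.0, 1.0, size=6)
y = rng.normal(0.5, 1.0, size=8)

combined = np.concatenate([x, y])
ranks = stats.rankdata(combined)     # ranks 1..(n + m)
w = ranks[: len(x)].sum()            # rank-sum of the X observations

u = sum(1 for xi in x for yj in y if xi < yj)

# Identity linking the two statistics (no ties):
n, m = len(x), len(y)
assert w + u == n * (n + 1) / 2 + n * m
```

The identity follows because the rank of Xi equals its rank among the X's plus the number of Y's below it, so summing over i gives W = n(n + 1)/2 + #{Yj < Xi} = n(n + 1)/2 + nm − U.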