3. Let $X_1, \dots, X_n$ be iid random variables with mean $\mu$ and variance $\sigma^2$. Let $\bar X$ denote the sample mean and $V = \sum_{i=1}^n (X_i - \bar X)^2$. (a) Derive the expected values of $\bar X$ and $V$. (b) Further suppose that $X_1, \dots, X_n$ are normally distributed. Let $A_{n \times n} = ((a_{ij}))$ be an orthogonal matrix whose first row is $(\tfrac{1}{\sqrt{n}}, \dots, \tfrac{1}{\sqrt{n}})$, and let $Y = AX$, where $Y = (Y_1, \dots, Y_n)'$ and $X = (X_1, \dots, X_n)'$ are (column) vectors. (It is not necessary to know $a_{ij}$ ...)
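A sketch of part (a), using only linearity of expectation together with $E[X_i^2] = \sigma^2 + \mu^2$ and $E[\bar X^2] = \sigma^2/n + \mu^2$:

```latex
E[\bar X] = \frac{1}{n}\sum_{i=1}^n E[X_i] = \mu, \qquad
E[V] = E\Big[\sum_{i=1}^n X_i^2 - n\bar X^2\Big]
     = n(\sigma^2 + \mu^2) - n\Big(\frac{\sigma^2}{n} + \mu^2\Big)
     = (n-1)\sigma^2 .
```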
Let $X_1, \dots, X_n$ be iid random variables with mean $\mu$, standard deviation $\sigma$, and finite fourth moment. Prove by induction the identity
Let $X_1, \dots, X_n$ be iid random variables with distribution $\mathrm{Bern}(p)$. (a) Is the statistic $\frac{1}{n}\sum_i X_i$ an unbiased estimator of $p$? (b) Is the statistic $\big(\frac{1}{n}\sum_i X_i\big)^2$ an unbiased estimator of $p^2$?
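As a sanity check on both parts, one can compute $E[\bar X]$ and $E[\bar X^2]$ exactly by enumerating all $2^n$ Bernoulli outcomes with rational arithmetic (a sketch; the helper `bern_expectations` is hypothetical, not part of the exercise):

```python
from fractions import Fraction
from itertools import product

def bern_expectations(n, p):
    """Exact E[X-bar] and E[X-bar^2] for n iid Bern(p) draws,
    by enumerating all 2^n outcomes (hypothetical helper)."""
    e_mean = Fraction(0)
    e_mean_sq = Fraction(0)
    for xs in product([0, 1], repeat=n):
        prob = Fraction(1)
        for x in xs:
            prob *= p if x == 1 else (1 - p)
        xbar = Fraction(sum(xs), n)
        e_mean += prob * xbar
        e_mean_sq += prob * xbar**2
    return e_mean, e_mean_sq

n, p = 4, Fraction(1, 3)
e_mean, e_mean_sq = bern_expectations(n, p)
# (a) X-bar is unbiased for p.
print(e_mean == p)  # True
# (b) E[X-bar^2] = p^2 + p(1-p)/n > p^2, so X-bar^2 is biased upward for p^2.
print(e_mean_sq == p**2 + p * (1 - p) / n)  # True
```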
Let $X_1, \dots, X_n$ be a sample of iid $\mathrm{Bin}(1, \theta)$ random variables, and let $T = \bar X(1 - \bar X)$ be an estimator of $\mathrm{Var}(X_i) = \theta(1 - \theta)$. Determine $E(T)$ and $\mathrm{Bias}(T; \theta(1-\theta))$.
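A worked sketch: since $E[\bar X] = \theta$ and $E[\bar X^2] = \operatorname{Var}(\bar X) + \theta^2 = \theta(1-\theta)/n + \theta^2$,

```latex
E(T) = E[\bar X] - E[\bar X^2]
     = \theta - \theta^2 - \frac{\theta(1-\theta)}{n}
     = \frac{n-1}{n}\,\theta(1-\theta), \qquad
\operatorname{Bias}(T) = -\frac{\theta(1-\theta)}{n}.
```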
Suppose $X_1, \dots, X_n$ are iid $\mathrm{Poisson}(\lambda)$ random variables. Show by direct calculation, without using any theorem in mathematical statistics, that (a) $\bar X = \sum_{i=1}^n X_i / n$ is an unbiased estimator for $\lambda$; (b) $\bar X$ is optimal in MSE among all unbiased estimators, that is, if $T$ is any other unbiased estimator, then $E_\lambda(\bar X - \lambda)^2 \le E_\lambda(T - \lambda)^2$.
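A direct computation for part (a), starting from the Poisson pmf itself rather than quoting $E[X_i] = \lambda$:

```latex
E_\lambda[X_i] = \sum_{k=0}^{\infty} k\,\frac{e^{-\lambda}\lambda^k}{k!}
             = \lambda \sum_{k=1}^{\infty} \frac{e^{-\lambda}\lambda^{k-1}}{(k-1)!}
             = \lambda, \qquad\text{so}\qquad
E_\lambda[\bar X] = \frac{1}{n}\sum_{i=1}^n E_\lambda[X_i] = \lambda .
```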
7. Let $X_1$ and $X_2$ be two iid $\mathrm{Exp}(\lambda)$ random variables. Set $Y_1 = X_1 - X_2$ and $Y_2 = X_1 + X_2$. Determine the joint pdf of $Y_1$ and $Y_2$, identify the marginal distributions of $Y_1$ and $Y_2$, and decide whether or not $Y_1$ and $Y_2$ are independent. [10]
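A quick empirical look at the independence question (a simulation sketch, not part of the exercise; the sample size and seed are arbitrary). Since $|Y_1| = |X_1 - X_2| \le X_1 + X_2 = Y_2$ always, the support of $Y_1$ depends on $Y_2$, which already rules out independence, even though the pair turns out to be uncorrelated:

```python
import random
import statistics

# Simulate Y1 = X1 - X2, Y2 = X1 + X2 for X1, X2 iid Exp(lam).
random.seed(0)
lam = 1.0
n = 200_000

y1, y2 = [], []
for _ in range(n):
    x1 = random.expovariate(lam)
    x2 = random.expovariate(lam)
    y1.append(x1 - x2)
    y2.append(x1 + x2)

# |Y1| <= Y2 holds for every single draw: the range of Y1 is tied to Y2,
# so Y1 and Y2 cannot be independent.
print(all(abs(a) <= b for a, b in zip(y1, y2)))  # True

# Yet the sample correlation is essentially zero: uncorrelated but dependent.
m1, m2 = sum(y1) / n, sum(y2) / n
cov = sum((a - m1) * (b - m2) for a, b in zip(y1, y2)) / n
corr = cov / (statistics.pstdev(y1) * statistics.pstdev(y2))
print(abs(corr) < 0.05)  # True
```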
1. (20 points) Let $\{X_n\}$ be a sequence of iid random variables with common pdf $f(x) = \frac{1}{\sqrt{2\pi}}\,e^{-x^2/2}$, $x \in \mathbb{R}$. Find the limit in probability of the sequence of random variables $\{Y_n\}$, where $Y_n = \frac{1}{n}\sum_{i=1}^n |X_i|$.
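By the weak law of large numbers the limit should be $E|X_1| = \sqrt{2/\pi} \approx 0.7979$; a small simulation sketch (not part of the exercise; seed and sample size are arbitrary) illustrates this:

```python
import math
import random

random.seed(1)
n = 200_000

# Y_n = (1/n) * sum |X_i| for X_i ~ N(0, 1); the WLLN says this
# converges in probability to E|X_1| = sqrt(2/pi).
y_n = sum(abs(random.gauss(0.0, 1.0)) for _ in range(n)) / n
print(abs(y_n - math.sqrt(2 / math.pi)) < 0.01)  # True: close to the limit
```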
Exercise 5. Let $X_1, \dots, X_n$ be iid normal random variables: $X_i \sim N(\mu, \sigma^2)$. We denote $\bar X = \frac{1}{n}\sum_{i=1}^n X_i$ and $S^2 = \frac{1}{n-1}\sum_{i=1}^n (X_i - \bar X)^2$. Show that (i) $\bar X \perp\!\!\!\perp S^2$ (i.e., $\bar X$ is independent of $S^2$); (ii) $\bar X \sim N(\mu, \sigma^2/n)$; (iii) $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}$.
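An empirical illustration of (i) and (iii) (a simulation sketch; the parameter values, seed, and sample sizes are arbitrary choices, not from the exercise): across repeated samples, $\bar X$ and $S^2$ should be essentially uncorrelated, and (iii) implies $E[S^2] = \sigma^2$:

```python
import random
import statistics

random.seed(2)
mu, sigma, n, reps = 1.0, 2.0, 5, 20_000

# Draw many samples of size n and record (X-bar, S^2) for each.
xbars, s2s = [], []
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    s2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)
    xbars.append(xbar)
    s2s.append(s2)

# (i) independence implies zero correlation between X-bar and S^2.
m1, m2 = sum(xbars) / reps, sum(s2s) / reps
cov = sum((a - m1) * (b - m2) for a, b in zip(xbars, s2s)) / reps
corr = cov / (statistics.pstdev(xbars) * statistics.pstdev(s2s))
print(abs(corr) < 0.05)  # True

# (iii) implies E[S^2] = sigma^2; the sample average of S^2 matches.
print(abs(m2 - sigma**2) < 0.2)  # True
```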
4. Let $X_1, X_2, \dots, X_n$ be a random sample from the $N(\mu, \sigma^2)$ distribution, and let $s^2 = \frac{1}{n}\sum_{i=1}^n (X_i - \bar X)^2$ and $S^2 = \frac{1}{n-1}\sum_{i=1}^n (X_i - \bar X)^2$ be estimators of $\sigma^2$. (i) Show that the MSE of $s^2$ is smaller than the MSE of $S^2$. (ii) Find $E[\sqrt{S^2}]$ and suggest an unbiased estimator of $\sigma$.
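A sketch of the comparison in (i), using $\operatorname{Var}(S^2) = \frac{2\sigma^4}{n-1}$ for normal samples, $E[s^2] = \frac{n-1}{n}\sigma^2$, and $\operatorname{Var}(s^2) = \frac{2(n-1)\sigma^4}{n^2}$:

```latex
\operatorname{MSE}(s^2)
 = \operatorname{Var}(s^2) + \operatorname{Bias}(s^2)^2
 = \frac{2(n-1)\sigma^4}{n^2} + \frac{\sigma^4}{n^2}
 = \frac{(2n-1)\sigma^4}{n^2}
 < \frac{2\sigma^4}{n-1}
 = \operatorname{MSE}(S^2),
```

since $(2n-1)(n-1) = 2n^2 - 3n + 1 < 2n^2$. For (ii), the starting point is $(n-1)S^2/\sigma^2 \sim \chi^2_{n-1}$, which yields $E[\sqrt{S^2}] = \sigma\,\sqrt{\tfrac{2}{n-1}}\;\frac{\Gamma(n/2)}{\Gamma((n-1)/2)}$.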
3. Let $X_1, X_2, \dots, X_n$ be random variables with a common mean $\mu$. Suppose that $\operatorname{cov}[X_i, X_j] = 0$ for all $i$ and $j$ such that $j > i + 1$. If ..., prove that $\operatorname{var}[\bar X] = \dots$ ... $n(n-3)$ ...
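The exact hypotheses and target formula are garbled in the source, but the key opening step is the general variance decomposition for $\bar X$: since $\operatorname{cov}[X_i, X_j]$ vanishes whenever $j > i + 1$, only adjacent pairs survive,

```latex
\operatorname{var}[\bar X]
 = \frac{1}{n^2}\Big(\sum_{i=1}^{n} \operatorname{var}[X_i]
   + 2\sum_{i=1}^{n-1} \operatorname{cov}[X_i, X_{i+1}]\Big).
```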