2. Let X1, X2, ..., Xn be independent random variables with a common mean μ and variances σ1², σ2², ..., σn². Prove that Σi (Xi − X̄)² / [n(n−1)] is an unbiased estimate of Var(X̄).

3. Suppose that in Exercise 2 the variances are known. Let W = Σi ui Xi ...
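The claim in Exercise 2 can be sanity-checked numerically. Below is a minimal Monte Carlo sketch (NumPy; the sample size, seed, and the particular unequal variances are arbitrary choices of mine, not part of the exercise): it averages the estimator Σi (Xi − X̄)² / [n(n−1)] over many samples and compares against the true Var(X̄) = Σi σi² / n².

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
sigmas = np.array([1.0, 2.0, 0.5, 3.0, 1.5])   # unequal standard deviations (illustrative)
mu = 2.0
trials = 200_000

# Each row is one sample X_1..X_n with common mean mu and Var(X_i) = sigmas[i]**2.
X = mu + sigmas * rng.standard_normal((trials, n))
Xbar = X.mean(axis=1)

# The estimator from Exercise 2: sum_i (X_i - Xbar)^2 / [n(n-1)].
est = ((X - Xbar[:, None]) ** 2).sum(axis=1) / (n * (n - 1))

# True variance of the sample mean: Var(Xbar) = sum_i sigma_i^2 / n^2.
true_var_xbar = (sigmas ** 2).sum() / n ** 2
print(est.mean(), true_var_xbar)
```

The Monte Carlo average of the estimator should agree with Var(X̄) to within simulation error, which is what "unbiased" predicts.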
3. Let X1, ..., Xn be iid random variables with mean μ and variance σ². Let X̄ denote the sample mean and V = Σi (Xi − X̄)².
(a) Derive the expected values of X̄ and V.
(b) Further suppose that X1, ..., Xn are normally distributed. Let A_{n×n} = ((aij)) be an orthogonal matrix whose first row is (1/√n, ..., 1/√n), and let Y = AX, where Y = (Y1, ..., Yn) and X = (X1, ..., Xn) are (column) vectors. (It is not necessary to know aij for ...)
5. Let X1, X2, ..., Xn be a random sample from a distribution with finite variance. Show that
(i) Cov(Xi − X̄, X̄) = 0 for each i, and
(ii) ρ(Xi − X̄, Xj − X̄) = −1/(n−1), i ≠ j, i, j = 1, ..., n.
(Recall that ρ(X, Y) = Cov(X, Y) / √(Var(X) Var(Y)) for any two random variables X and Y.)
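Both identities in Exercise 5 are easy to check by simulation before proving them. A minimal sketch (the exponential distribution, sample size n = 6, and seed are my illustrative choices; any finite-variance distribution works):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
trials = 400_000

# Exponential(1) has mean 1 and variance 1; the result is distribution-free.
X = rng.exponential(1.0, size=(trials, n))
Xbar = X.mean(axis=1)
R = X - Xbar[:, None]          # residuals X_i - Xbar

# (i) Cov(X_i - Xbar, Xbar) should be 0 for each i; check i = 1.
cov_i = np.mean(R[:, 0] * (Xbar - Xbar.mean()))

# (ii) corr(X_i - Xbar, X_j - Xbar) should be -1/(n-1) for i != j.
corr_ij = np.corrcoef(R[:, 0], R[:, 1])[0, 1]

print(cov_i, corr_ij, -1 / (n - 1))
```

With n = 6 the residual correlation should sit near −1/5 = −0.2, and the covariance in (i) near zero.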
1 [3]. Let X1, X2, X3 be iid random variables with common mean μ and variance σ². Find
(a) E(2X1 − 3X2 + 4X3);
(b) Var(2X1 − 4X2);
(c) Cov(X1 − X2, X1 + 2X2).
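The three parts follow from linearity of expectation and bilinearity of covariance: (a) 3μ, (b) (4 + 16)σ² = 20σ², (c) σ² − 2σ² = −σ². A quick Monte Carlo check (the values μ = 1, σ² = 4 and the normal distribution are my illustrative assumptions; the exercise leaves them general):

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma = 1.0, 2.0            # illustrative: mu = 1, sigma^2 = 4
X = mu + sigma * rng.standard_normal((500_000, 3))
X1, X2, X3 = X[:, 0], X[:, 1], X[:, 2]

# (a) E(2X1 - 3X2 + 4X3) = (2 - 3 + 4) mu = 3 mu
a = (2 * X1 - 3 * X2 + 4 * X3).mean()

# (b) Var(2X1 - 4X2) = 4 sigma^2 + 16 sigma^2 = 20 sigma^2  (by independence)
b = (2 * X1 - 4 * X2).var()

# (c) Cov(X1 - X2, X1 + 2X2) = Var(X1) - 2 Var(X2) = -sigma^2
c = np.cov(X1 - X2, X1 + 2 * X2)[0, 1]

print(a, b, c)
```

With μ = 1 and σ² = 4 the three estimates should land near 3, 80, and −4 respectively.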
Let X1, ..., Xn be random variables with the same mean and with covariance function Cov(Xi, Xj) = σ² ρ^|i−j|, where |ρ| < 1. Find the mean and variance of Sn = X1 + ... + Xn. Assume that E(Xi) = μ and Var(Xi) = σ² for i ∈ {1, 2, ..., n}.
... such that W is an unbiased estimate of μ with minimum variance.

4. The random variables X1, X2, ..., Xn have a common nonzero mean μ, a common variance σ², and the correlation between any pair of the random variables is ρ.
(a) Find Var(X̄), and hence prove that −1/(n−1) ≤ ρ ≤ 1.
(b) If a Σi Xi² + b (Σi Xi)² is an unbiased estimate of σ², find a and b. Hence show that, in this case, ...
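For part (a) of Exercise 4, the standard result is Var(X̄) = (σ²/n)(1 + (n−1)ρ); nonnegativity of this variance is what forces ρ ≥ −1/(n−1). A simulation sketch with equicorrelated normals (n, μ, σ², ρ, and the seed are my illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n, mu, sigma2, rho = 5, 1.0, 2.0, 0.3

# Equicorrelated covariance matrix: sigma^2 on the diagonal, rho*sigma^2 off it.
cov = sigma2 * ((1 - rho) * np.eye(n) + rho * np.ones((n, n)))

X = rng.multivariate_normal(np.full(n, mu), cov, size=300_000)
var_xbar = X.mean(axis=1).var()

# Var(Xbar) = (sigma^2 / n) * (1 + (n-1)*rho)
theory = sigma2 / n * (1 + (n - 1) * rho)
print(var_xbar, theory)
```

With these parameters the formula gives (2/5)(1 + 1.2) = 0.88, and the empirical variance of X̄ should agree to within simulation error.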
2. Let X1, X2, ..., Xn denote independent and identically distributed random variables with variance σ². Which of the following is sufficient to conclude that the estimator T = f(X1, ..., Xn) of a parameter θ is consistent (fully justify your answer):
(a) Var(T) → 0 as n → ∞.
(b) E(T) = (n−1)θ/n and Var(T) → 0 as n → ∞.
(c) E(T) = θ.
(d) E(T) = θ and Var(T) = σ².
1. Let X1, X2, ..., Xn be independent Normal(μ, σ²) random variables. Let Yn = (1/n) Σ_{i=1}^n Xi denote a sequence of random variables.
(a) Find E(Yn) and Var(Yn) for all n in terms of μ and σ².
(b) Find the PDF of Yn for all n.
(c) Find the MGF of Yn for all n.
Central Limit Theorem: Let X1, X2, ..., Xn be i.i.d. random variables with E(Xi) = μ and Var(Xi) = σ². Define Z = X1 + X2 + ... + Xn. The distribution of Z converges to a Gaussian distribution:
P(Z ≤ z) = 1 − Q((z − μZ)/σZ),
where μZ = nμ and σZ = σ√n. Use MATLAB to demonstrate the central limit theorem. To achieve this, generate N i.i.d. random variables (with a distribution of your choice) and show that the distribution of their sum approaches a Gaussian distribution. Plot the distribution and include the MATLAB code. Hint: you may find the hist() function helpful.
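The exercise asks for MATLAB; as a language-neutral sketch of the same experiment, here is a NumPy version (uniform summands, N = 50, and the seed are my choices; `np.histogram` plays the role of MATLAB's `hist()`). It compares the empirical CDF of the sum with the Gaussian limit at a few points:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(4)
N, trials = 50, 100_000

# Z = X1 + ... + XN with Xi ~ Uniform(0,1): mu = 1/2, sigma^2 = 1/12 per term.
Z = rng.random((trials, N)).sum(axis=1)
mu_Z, sigma_Z = N * 0.5, sqrt(N / 12)

def phi(t):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(t / sqrt(2)))

# Empirical CDF of Z versus the Gaussian limit P(Z <= z) = phi((z - mu_Z)/sigma_Z).
zs = [mu_Z - sigma_Z, mu_Z, mu_Z + sigma_Z]
emp = [(Z <= z).mean() for z in zs]
gauss = [phi((z - mu_Z) / sigma_Z) for z in zs]
print(emp)
print(gauss)

# For the plot, np.histogram gives the bar heights that hist() would draw.
counts, bins = np.histogram(Z, bins=60, density=True)
```

Note the normalization in the CDF uses the standard deviation σZ, not the variance, matching the corrected statement above.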
Let X1, X2, ... be independent continuous random variables with a common distribution function F and density f. For k > 1, let
Nk = min{n > k : Xn is the kth largest of X1, ..., Xn}.
(a) Show that Pr(Nk = n) = k/[n(n−1)], n > k.
(b) Argue that
f_{X_{Nk}}(x) = Σ_{n=k+1}^∞ [k/(n(n−1))] · [n!/((k−1)!(n−k)!)] (F(x))^{n−k} (1 − F(x))^{k−1} f(x).
(c) Prove the following identity:
a^{1−k} = Σ_{i=0}^∞ C(k+i−2, i) (1 − a)^i, a ∈ (0, 1), k ≥ 2.
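Part (a) rests on the fact that, for iid continuous variables, the relative rank of Xm among X1, ..., Xm is uniform on {1, ..., m} and independent across m, so Nk is the first m > k at which an independent Bernoulli(1/m) trial succeeds. That makes the reconstructed formula Pr(Nk = n) = k/[n(n−1)] easy to check by simulation (k, the trial count, the tail cap, and the seed are my choices):

```python
import numpy as np

rng = np.random.default_rng(5)
k, trials, cap = 3, 100_000, 10_000

def sample_Nk(k, rng, cap):
    # X_m is the kth largest of X_1..X_m with probability 1/m, independently
    # over m, so N_k is the first success of Bernoulli(1/m) for m > k.
    for m in range(k + 1, cap):
        if rng.random() < 1.0 / m:
            return m
    return cap  # truncate the heavy tail; Pr(N_k >= cap) = k/cap is negligible here

samples = np.array([sample_Nk(k, rng, cap) for _ in range(trials)])

# Compare empirical frequencies with Pr(N_k = n) = k / (n(n-1)) for n > k.
for n in (k + 1, k + 2, k + 3):
    print(n, (samples == n).mean(), k / (n * (n - 1)))
```

With k = 3 the predicted probabilities for n = 4, 5, 6 are 1/4, 3/20, and 1/10; note that Σ_{n>k} k/[n(n−1)] telescopes to 1, so the formula is a genuine probability mass function even though E(Nk) is infinite.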