Let X = (X1, …, Xn)′ be the n×p data matrix, where Xi = (Xi1, …, Xip)′ is the ith observation. Let X̄ = (1/n) Σ_{i=1}^n Xi be the sample mean. Let s_{j1 j2} = (1/n) Σ_{i=1}^n (X_{i j1} − X̄_{j1})(X_{i j2} − X̄_{j2}) be the sample covariance between the j1th and j2th variables. Let S = (s_{j1 j2}) be the sample covariance matrix. Show that S = (1/n) X′X − X̄X̄′.
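A quick numerical check of the identity S = (1/n)X′X − X̄X̄′ (a sketch; the random data, sizes, and variable names are my own, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))          # n×p data matrix
xbar = X.mean(axis=0)                # sample mean, shape (p,)

# Definition: s_{j1 j2} = (1/n) Σ_i (X_{i j1} - x̄_{j1})(X_{i j2} - x̄_{j2})
S_def = (X - xbar).T @ (X - xbar) / n

# Claimed identity: S = (1/n) X'X - x̄ x̄'
S_id = X.T @ X / n - np.outer(xbar, xbar)

assert np.allclose(S_def, S_id)
```

The check works because expanding (Xi − X̄)(Xi − X̄)′ and summing makes the two cross terms collapse into −2X̄X̄′, leaving (1/n)ΣXiXi′ − X̄X̄′.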
Let X1, …, Xn be random variables with the same mean and with covariance function … where |ρ| < 1. Find the mean and variance of Sn = X1 + ⋯ + Xn. Assume that E(Xi) = μ and V(Xi) = σ² for i ∈ {1, 2, …, n}.
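The covariance function is cut off in the question, so the sketch below assumes, purely for illustration, the common AR(1)-type structure Cov(Xi, Xj) = σ²ρ^{|i−j|} with |ρ| < 1; it checks the closed-form Var(Sn) = σ²[n + 2Σ_{k=1}^{n−1}(n−k)ρ^k] against the direct sum 1′Σ1 (E(Sn) = nμ holds for any covariance):

```python
import numpy as np

# Assumed covariance (the question's covariance function is missing):
# Cov(Xi, Xj) = sigma^2 * rho^{|i-j|}, an AR(1)-type structure, |rho| < 1.
n, mu, sigma, rho = 6, 2.0, 1.5, 0.4
idx = np.arange(n)
Sigma = sigma**2 * rho**np.abs(np.subtract.outer(idx, idx))

# Var(Sn) = 1' Sigma 1, i.e. the sum of all entries of Sigma
var_direct = Sigma.sum()

# Closed form: sigma^2 * [n + 2 * sum_{k=1}^{n-1} (n-k) rho^k]
k = np.arange(1, n)
var_formula = sigma**2 * (n + 2 * ((n - k) * rho**k).sum())

assert np.isclose(var_direct, var_formula)
```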
Please answer part (a). a. Let X1, …, Xn be i.i.d. random variables with Xi ~ N(μ, σ²). Express the vector (X1 − X̄, X2 − X̄, …, Xn − X̄)′ in the form AX and find its mean and variance–covariance matrix. Show some typical elements of the variance–covariance matrix. b. Refer to part (a). The sample variance is given by S² = (1/(n−1)) Σ_{i=1}^n (Xi − X̄)², which can be expressed as S² = (1/(n−1)) X′(I − (1/n)11′)X (why?)....
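The quadratic-form expression can be verified numerically; a minimal sketch (my own data and names) using the centering matrix C = I − (1/n)11′, which is symmetric and idempotent:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10
x = rng.normal(size=n)

C = np.eye(n) - np.ones((n, n)) / n   # centering matrix I - (1/n)11'
assert np.allclose(C @ C, C)          # idempotent: C x = vector of xi - x̄

# x'Cx = (Cx)'(Cx) = sum of (xi - x̄)^2 = (n-1) S^2
lhs = x @ C @ x
rhs = (n - 1) * x.var(ddof=1)
assert np.isclose(lhs, rhs)
```

The "why?" in part (b) is exactly the idempotence checked above: Cx is the vector of deviations Xi − X̄, so x′Cx is their sum of squares.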
Let Xi = I(treatment for the ith patient is successful). Then Pr(Xi = 1 | P = p) = p. Suppose that, conditionally on P = p, X1, X2, …, Xn are independent, and let X = Σ_{i=1}^n Xi. Suppose P ~ U(0, 1) (uniform distribution); we want to find E(X). Could you please show me the steps of this question? Thank you so much!
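By iterated expectation, E(X) = E[E(X | P)] = E[nP] = n/2. A Monte Carlo sketch of that answer (sample sizes and seed are my own choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 10, 200_000
P = rng.uniform(size=reps)            # P ~ U(0,1)
X = rng.binomial(n, P)                # given P = p, X ~ Binomial(n, p)

# Iterated expectation: E(X) = E[E(X|P)] = E[nP] = n/2
assert abs(X.mean() - n / 2) < 0.05
```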
5. Let X1, X2, …, Xn be a random sample from a distribution with finite variance. Show that (i) Cov(Xi − X̄, X̄) = 0 for each i, and (ii) ρ(Xi − X̄, Xj − X̄) = −1/(n−1) for i ≠ j, i, j = 1, …, n. (Recall that ρ(X, Y) = Cov(X, Y)/√(Var(X) Var(Y)) for any two random variables X and Y.)
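Both claims can be checked by simulation before proving them; a sketch under my own choice of sample (standard normals, n = 5):

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 5, 400_000
X = rng.normal(size=(reps, n))
xbar = X.mean(axis=1, keepdims=True)
D = X - xbar                          # deviations Xi - X̄

# (i) Cov(X1 - X̄, X̄) ≈ 0 (both factors have mean 0 here)
c = np.mean(D[:, 0] * xbar[:, 0])
assert abs(c) < 0.01

# (ii) ρ(X1 - X̄, X2 - X̄) ≈ -1/(n-1)
r = np.corrcoef(D[:, 0], D[:, 1])[0, 1]
assert abs(r - (-1 / (n - 1))) < 0.02
```

The −1/(n−1) comes from Cov(Xi − X̄, Xj − X̄) = −σ²/n for i ≠ j and Var(Xi − X̄) = σ²(n−1)/n.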
3. Let X1, …, Xn be iid random variables with mean μ and variance σ². Let X̄ denote the sample mean and V = Σ_{i=1}^n (Xi − X̄)². (a) Derive the expected values of X̄ and V. (b) Further suppose that X1, …, Xn are normally distributed. Let A_{n×n} = ((a_{ij})) be an orthogonal matrix whose first row is (1/√n, …, 1/√n), and let Y = AX, where Y = (Y1, …, Yn)′ and X = (X1, …, Xn)′ are (column) vectors. (It is not necessary to know a_{ij}...
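Part (a) has the well-known answers E(X̄) = μ and E(V) = (n−1)σ²; a Monte Carlo sketch of both (parameters and seed are mine):

```python
import numpy as np

rng = np.random.default_rng(4)
n, mu, sigma, reps = 8, 3.0, 2.0, 200_000
X = rng.normal(mu, sigma, size=(reps, n))
xbar = X.mean(axis=1)
V = ((X - xbar[:, None])**2).sum(axis=1)   # V = Σ (Xi - X̄)²

assert abs(xbar.mean() - mu) < 0.02              # E(X̄) = μ
assert abs(V.mean() - (n - 1) * sigma**2) < 0.2  # E(V) = (n-1)σ²
```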
Problem 4. Suppose X1, …, Xn ~ f(x) independently. Let μ = E(Xi) and σ² = Var(Xi). Let X̄ = Σᵢ Xi / n. (1) Calculate E(X̄) and Var(X̄). (2) Explain why X̄ → μ as n → ∞. What is the shape of the density of X̄? (3) Let Xi ~ Bernoulli(p); calculate μ and σ² in terms of p. (4) Continuing from (3), explain why X̄ is the frequency of heads. Calculate E(X̄) and Var(X̄). Explain why X̄ → p. What is the shape...
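Parts (1), (3), and (4) give E(X̄) = μ = p and Var(X̄) = σ²/n = p(1−p)/n in the Bernoulli case; a simulation sketch (my own p, n, and replication count):

```python
import numpy as np

rng = np.random.default_rng(5)
p, n, reps = 0.3, 200, 20_000
X = rng.binomial(1, p, size=(reps, n))   # Xi ~ Bernoulli(p): μ = p, σ² = p(1-p)
xbar = X.mean(axis=1)                    # frequency of heads in each replicate

assert abs(xbar.mean() - p) < 2e-3                 # E(X̄) = p
assert abs(xbar.var() - p * (1 - p) / n) < 1e-4    # Var(X̄) = p(1-p)/n
```

Since Var(X̄) → 0, X̄ concentrates at p (the law of large numbers), and by the CLT the density of X̄ is approximately bell-shaped (normal) for large n.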
- Let {Xn} denote a sequence of iid random variables such that P(X1 = 1) = P(X1 = −1) = 1/2. Let Sn = X1 + X2 + ⋯ + Xn. (a) Find E(Sn) and Var(Sn); (b) Show that Sn is a martingale.
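For (a), E(Sn) = 0 and Var(Sn) = n; the martingale property in (b) reduces to E[Sn − Sn−1 | past] = E(Xn) = 0. A simulation sketch of the symmetric random walk (sizes and seed are mine):

```python
import numpy as np

rng = np.random.default_rng(6)
n, reps = 20, 400_000
X = rng.choice([-1, 1], size=(reps, n))   # P(Xi = 1) = P(Xi = -1) = 1/2
S = X.cumsum(axis=1)                      # Sn along each path

assert abs(S[:, -1].mean()) < 0.03        # E(Sn) = 0
assert abs(S[:, -1].var() - n) < 0.3      # Var(Sn) = n

# Martingale heuristic: increments S_k - S_{k-1} = X_k have mean 0
assert abs((S[:, 1:] - S[:, :-1]).mean()) < 0.01
```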
Let X1, X2, X3, … be independent random variables with P(Xi = 1) = p = 1 − P(Xi = 0), i ≥ 1. Define: N1 = min{n: X1 + … + Xn = 5}; N2 = 3 if X1 = 0, 5 if X1 = 1; N3 = 3 if X4 = 0, 2 if X4 = 1. Which of the Ni are stopping times for the sequence X1, X2, …?
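A small sketch (my own simulation setup) illustrating why N1 qualifies as a stopping time while N3 does not: whether N1 = n is decided by X1, …, Xn alone, but N3 takes the values 2 or 3 while depending on X4, a value observed only later.

```python
import numpy as np

rng = np.random.default_rng(7)
p = 0.4
X = (rng.uniform(size=1000) < p).astype(int)
csum = X.cumsum()

# N1 = min{n : X1 + ... + Xn = 5}: computable from the history up to time n,
# so it is a stopping time (as is N2, which is fixed by X1 alone).
N1 = int(np.argmax(csum >= 5)) + 1
assert csum[N1 - 1] == 5 and X[N1 - 1] == 1   # the 5th success occurs at time N1

# N3, by contrast, is 2 or 3 depending on X4 -- a value observed only at
# time 4 -- so the event {N3 = 2} is not determined by X1, X2 alone.
```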
Let X0, X1, … be a Markov chain whose state space is Z (the integers). Recall the Markov property: P(Xn = in | X0 = i0, X1 = i1, …, Xn−1 = in−1) = P(Xn = in | Xn−1 = in−1), ∀n, ∀it. Does the following always hold: P(Xn ≥ 0 | X0 ≥ 0, X1 ≥ 0, …, Xn−1 ≥ 0) = P(Xn ≥ 0 | Xn−1 ≥ 0)? (Prove if “yes”, provide a counterexample if “no”.)
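The answer is "no": conditioning on the whole past can pin down which nonnegative state the chain is in. One candidate counterexample (my own construction, verified exactly below): states {−1, 0, 1}, deterministic moves 1→1, 0→−1, −1→0, with X0 uniform on {−1, 1}. Then P(X2 ≥ 0 | X0 ≥ 0, X1 ≥ 0) = 1 but P(X2 ≥ 0 | X1 ≥ 0) = 1/2.

```python
from fractions import Fraction

# Deterministic Markov chain on {-1, 0, 1}; X0 uniform on {-1, 1}
step = {1: 1, 0: -1, -1: 0}
paths = []                              # (probability, x0, x1, x2)
for x0 in (-1, 1):
    x1 = step[x0]
    x2 = step[x1]
    paths.append((Fraction(1, 2), x0, x1, x2))

def cond(event, given):
    """Exact conditional probability P(event | given) over the path space."""
    num = sum(p for (p, x0, x1, x2) in paths if event(x0, x1, x2) and given(x0, x1, x2))
    den = sum(p for (p, x0, x1, x2) in paths if given(x0, x1, x2))
    return num / den

lhs = cond(lambda x0, x1, x2: x2 >= 0, lambda x0, x1, x2: x0 >= 0 and x1 >= 0)
rhs = cond(lambda x0, x1, x2: x2 >= 0, lambda x0, x1, x2: x1 >= 0)
assert lhs == 1 and rhs == Fraction(1, 2)   # the two conditional probabilities differ
```

Intuitively, X1 ≥ 0 alone leaves the chain at 0 or 1 with equal probability, but knowing X0 ≥ 0 as well forces the path through state 1, where it stays.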