8. Let {Xn, n = 1, 2, . . .} and {Yn, n = 1, 2, . . .} be two sequences of random variables defined on the same sample space. Suppose that we know Xn → X and Yn → Y. Prove that Xn + Yn → X + Y.
1. Let X, X1, X2, ... be random variables defined on the same sample space. Assume that Xn → X. Assume further that there is a random variable Y with E[Y] < ∞ such that P(|Xn| ≤ Y) = 1 for each n. Show that lim(n→∞) E[Xn] = E[X].
Consider two sequences of random variables X1, X2, ... and Y1, Y2, .... Suppose that Xn converges to a and Yn converges to b, with probability 1. Show that Xn + Yn converges to a + b, with probability 1. Next, assuming that the random variables Yn and the limit b are nonzero, show that Xn/Yn converges to a/b with probability 1.
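A quick numerical illustration along a single sample path (the particular sequences Xn = a + U/n and Yn = b + V/n are our own choice here, picked because they converge with probability 1; they are not part of the exercise):

```python
import random

random.seed(0)
a, b = 3.0, 2.0
U, V = random.random(), random.random()   # one sample point omega, fixed once

def X(n): return a + U / n   # X_n(omega) -> a as n -> infinity
def Y(n): return b + V / n   # Y_n(omega) -> b, and Y_n != 0 since b = 2 > 1 >= V/n

# errors of the sum and the ratio along this sample path
sum_err = [abs(X(n) + Y(n) - (a + b)) for n in (1, 10, 100, 1000)]
ratio_err = [abs(X(n) / Y(n) - a / b) for n in (1, 10, 100, 1000)]
print(sum_err[-1] < 1e-2, ratio_err[-1] < 1e-2)
```

Both error sequences shrink like 1/n on this path, matching the pointwise statement the exercise asks you to prove.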
1. Let {Yn}, n = 1, 2, . . ., be a sequence of random variables, and let Y be a random variable on the same sample space. Let An(ε) be the event that |Yn − Y| ≥ ε. It can be shown that a sufficient condition for Yn to converge to Y w.p.1 as n → ∞ is that for every ε > 0, Σn P(An(ε)) < ∞. (a) Let {Xn} be independent uniformly distributed random variables on [0, 1], and let Yn = min(X1, . . . , Xn). In class, we showed that...
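For part (a), the sufficient condition applies directly (a sketch; the in-class computation the exercise refers to is truncated in the source). With Y = 0 and 0 < ε < 1,

```latex
P\bigl(A_n(\varepsilon)\bigr)
  = P\bigl(\min(X_1,\dots,X_n) \ge \varepsilon\bigr)
  = (1-\varepsilon)^n,
\qquad
\sum_{n=1}^{\infty} (1-\varepsilon)^n
  = \frac{1-\varepsilon}{\varepsilon} < \infty,
```

so the summability condition holds for every ε > 0, and Yn → 0 w.p.1.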
(Sums of normal random variables) Let X and Y be independent random variables, where X ~ N(2, 5) and Y ~ N(5, 9) (we use the notation N(mean, variance)). Let W = 3X − 2Y + 1. (a) Compute E(W) and Var(W). (b) It is known that the sum of independent normal random variables is normal. Compute P(W ≥ 6).
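A quick numerical check of (a) and (b), assuming the N(mean, variance) convention above (note the inequality direction in (b) is garbled in the source and is reconstructed here as W ≥ 6):

```python
import math

# Given: X ~ N(2, 5), Y ~ N(5, 9) with N(mean, variance), W = 3X - 2Y + 1
mean_W = 3 * 2 - 2 * 5 + 1          # linearity of expectation -> -3
var_W = 3**2 * 5 + (-2)**2 * 9      # independence -> 9*5 + 4*9 = 81
sd_W = math.sqrt(var_W)             # 9.0

def normal_cdf(x, mu, sigma):
    """Normal CDF built from the error function (standard identity)."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# W is normal (linear combination of independent normals), so
p_ge_6 = 1.0 - normal_cdf(6.0, mean_W, sd_W)   # P(W >= 6) = 1 - Phi((6-(-3))/9) = 1 - Phi(1)
print(mean_W, var_W, round(p_ge_6, 4))         # prints -3 81 0.1587
```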
Exercise 8.41. The random variables X1, . . . , Xn are i.i.d. We also know that E[X1] = 0, E[X1²] = a, and E[X1³] = b. Let X̄n = (X1 + · · · + Xn)/n. Find the third moment of X̄n.
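Expanding E[X̄n³] by multilinearity, every term in which some index appears exactly once carries a factor E[Xi] = 0, so only the n diagonal terms E[Xi³] = b survive, suggesting the answer b/n². A brute-force check of that bookkeeping (the helper name is ours):

```python
from fractions import Fraction
from itertools import product
from collections import Counter

def third_moment_b_coeff(n):
    """Coefficient of b in E[Xbar_n^3] for i.i.d. X with E[X]=0, E[X^3]=b.

    Expands (X1 + ... + Xn)^3 / n^3 term by term: any index appearing
    exactly once contributes a factor E[Xi] = 0 (and a pair leaves a lone
    E[Xj] = 0 as well), so only the all-equal triples i = j = k survive.
    """
    coeff = Fraction(0)
    for idx in product(range(n), repeat=3):
        if len(Counter(idx)) == 1:       # all three indices equal: E[Xi^3] = b
            coeff += 1
        # every other index pattern contains a factor E[Xi] = 0 and drops out
    return coeff / Fraction(n**3)

print(third_moment_b_coeff(4))  # prints 1/16, i.e. b/n^2 with n = 4
```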
Suppose that X, Y, and Z are jointly distributed random variables, that is, they are defined on the same sample space. Suppose that we also have the following: E(X) = 8, E(Y) = 7, E(Z) = 2, Var(X) = 24, Var(Y) = 2, Var(Z) = 29. Compute the values of the expressions below. E(5X − 4) Var(−2 + 5Z)
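Both expressions follow from linearity: E(aX + c) = aE(X) + c and Var(c + aZ) = a²Var(Z). A check with the given moments (the reading of the second, garbled expression as −2 + 5Z is an assumption):

```python
# Given moments from the exercise
E_X, E_Y, E_Z = 8, 7, 2
Var_X, Var_Y, Var_Z = 24, 2, 29

e1 = 5 * E_X - 4       # E(5X - 4) = 5*8 - 4 = 36
v1 = 5**2 * Var_Z      # Var(-2 + 5Z): additive constants drop out -> 25*29 = 725
print(e1, v1)          # prints 36 725
```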
5. Let X1, X2, . . . , Xn be a random sample from a distribution with finite variance. Show that (i) Cov(Xi − X̄, X̄) = 0 for each i, and (ii) ρ(Xi − X̄, Xj − X̄) = −1/(n − 1) for i ≠ j, i, j = 1, . . . , n. (Recall that for any two random variables X and Y, ρ(X, Y) = Cov(X, Y)/√(Var(X)Var(Y)).)
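Every quantity here is a linear combination c·(X1, . . . , Xn) of i.i.d. variables, and for such combinations Cov(Σ cₖXₖ, Σ dₖXₖ) = σ²(c·d). An exact check of (i) and (ii) using that identity (the helper names are ours):

```python
from fractions import Fraction

def cov_coeff(c, d):
    """Cov(sum c_k X_k, sum d_k X_k) / sigma^2 for i.i.d. X_k with variance sigma^2."""
    return sum(ck * dk for ck, dk in zip(c, d))

n = 6
one_over_n = Fraction(1, n)
xbar = [one_over_n] * n                  # coefficient vector of Xbar

def dev(i):                              # coefficient vector of X_i - Xbar
    return [(1 if k == i else 0) - one_over_n for k in range(n)]

# (i) Cov(X_i - Xbar, Xbar) = 0 for each i
assert all(cov_coeff(dev(i), xbar) == 0 for i in range(n))

# (ii) rho(X_i - Xbar, X_j - Xbar) = -1/(n-1) for i != j
# (the two deviations have equal variance, so rho = Cov / Var)
rho = cov_coeff(dev(0), dev(1)) / cov_coeff(dev(0), dev(0))
print(rho)  # prints -1/5, i.e. -1/(n-1) with n = 6
```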
The set of compactly supported sequences is defined by c00 = {{xn} : there exists some N ≥ 0 such that xn = 0 for all n ≥ N}. Prove that for 1 ≤ p ≤ ∞ the metric space (c00, dp) is not complete.
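A standard witness, assuming dp denotes the usual ℓp metric (the source does not restate it): the truncations of a geometric sequence are Cauchy but have no limit in c00.

```latex
x^{(k)} = \bigl(1, \tfrac{1}{2}, \tfrac{1}{4}, \dots, 2^{-(k-1)}, 0, 0, \dots\bigr) \in c_{00},
\qquad
d_p\bigl(x^{(k)}, x^{(m)}\bigr)
  \le \Bigl(\sum_{n > k} 2^{-p(n-1)}\Bigr)^{1/p}
  \xrightarrow[k \to \infty]{} 0
  \quad (m \ge k),
```

with the analogous bound 2^{−k} for p = ∞. So (x^{(k)}) is dp-Cauchy, yet its coordinatewise limit (2^{−(n−1)}) has no zero tail; any candidate limit in c00 stays at positive dp-distance from the tail of the sequence, so no limit exists in c00.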
Let Y and X be two random variables. Let g(X) be any function of X used to predict Y. Finally, let the minimum mean squared error (MMSE) prediction problem be defined as: min over g(X) of E[(Y − g(X))²]. Prove that E(Y|X) is the solution to the MMSE problem, that is to say: E[(Y − E(Y|X))²] ≤ E[(Y − g(X))²].
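The standard argument adds and subtracts E(Y|X) inside the square (a sketch):

```latex
E\bigl[(Y - g(X))^2\bigr]
  = E\bigl[(Y - E(Y\mid X))^2\bigr]
  + 2\,E\bigl[(Y - E(Y\mid X))\,(E(Y\mid X) - g(X))\bigr]
  + E\bigl[(E(Y\mid X) - g(X))^2\bigr].
```

Conditioning the middle term on X and using E[Y − E(Y|X) | X] = 0 makes it vanish; the last term is nonnegative, so the choice g(X) = E(Y|X) minimizes the mean squared error.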
3. Let X1, . . . , Xn be iid random variables with mean μ and variance σ². Let X̄ denote the sample mean and V = Σi (Xi − X̄)². a) Derive the expected values of X̄ and V. b) Further suppose that X1, . . . , Xn are normally distributed. Let An×n = ((aij)) be an orthogonal matrix whose first row is (1/√n, . . . , 1/√n). Y = (Y1, . . . , Yn) and X = (X1, . . . , Xn) are (column) vectors. (It is not necessary to know aij for...
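One route for part (a), using Cov(Xi, X̄) = σ²/n (a sketch, not the only derivation):

```latex
E[\bar X] = \mu,
\qquad
E[V] = \sum_{i=1}^{n} E\bigl[(X_i - \bar X)^2\bigr]
     = \sum_{i=1}^{n} \operatorname{Var}(X_i - \bar X)
     = n\,\sigma^2\Bigl(1 - \tfrac{1}{n}\Bigr)
     = (n-1)\,\sigma^2,
```

since E[Xi − X̄] = 0 and Var(Xi − X̄) = σ² − 2 Cov(Xi, X̄) + Var(X̄) = σ² − 2σ²/n + σ²/n.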