2. Let X1, …, Xn be continuous random variables. Assume that the joint density factors as f(x1, …, xn) = f1(x1) ⋯ fn(xn), where fi is the marginal density of Xi. Show that X1, …, Xn are independent.
1. Let X1, …, Xn be discrete random variables. Assume that the joint pmf factors as p(x1, …, xn) = p1(x1) ⋯ pn(xn), where pi is the marginal pmf of Xi. Show that X1, …, Xn are independent.
4. Let X1, X2, … be independent random variables satisfying E(Xn) = μ and E(Xn⁴) ≤ B for some finite B > 0.
(a) Show that Yn = Xn − E(Xn) are independent with E(Yn) = 0 and, for all n, E(Yn⁴) ≤ 16B.
(b) Show that for Ȳn = (Y1 + … + Yn)/n,
E(Ȳn⁴) = (1/n⁴) Σi E(Yi⁴) + (6/n⁴) Σi<j E(Yi²)E(Yj²) ≤ 16B/n³ + 48B/n².
(c) Show that Σn P(|Ȳn| > ε) < ∞ for every ε > 0, and conclude Ȳn → 0 almost surely.
(d) Show that (X1 + … + Xn)/n → μ almost surely.
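As a sanity check on parts (b)–(c), a quick Monte Carlo sketch shows the fourth moment of Ȳn shrinking fast enough for the Borel–Cantelli argument. The Uniform(0, 2) choice for the Xi (so μ = 1 and fourth moments are bounded) and the trial counts are my own illustrative assumptions, not part of the problem:

```python
import random

def ybar4(n, trials=2000, seed=0):
    """Monte Carlo estimate of E(Ybar_n^4), where Y_i = X_i - 1 for
    X_i ~ Uniform(0, 2) (so E(X_i) = 1 and E(X_i^4) is bounded) and
    Ybar_n = (Y_1 + ... + Y_n) / n."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s = sum(rng.uniform(0.0, 2.0) - 1.0 for _ in range(n))
        total += (s / n) ** 4
    return total / trials

# The estimate shrinks roughly like 1/n^2, matching the bound in (b)
# and making sum_n P(|Ybar_n| > eps) summable as in (c).
m10, m100 = ybar4(10), ybar4(100)
```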
Let X1, X2, …, Xn be independent random variables with E(Xi) = μ and Var(Xi) = σ² for all i = 1, 2, …, n. Let X̄ = (X1 + X2 + … + Xn)/n be the average of those random variables. Find E(X̄) and Var(X̄).
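A short simulation sketch agrees with the standard answers E(X̄) = μ and Var(X̄) = σ²/n. The normal distribution for the Xi and the values μ = 5, σ = 2, n = 25 are arbitrary choices of mine (the result only needs the stated mean and variance):

```python
import random

def sample_mean_stats(mu=5.0, sigma=2.0, n=25, trials=20000, seed=1):
    """Simulate Xbar = (X1 + ... + Xn) / n for i.i.d. X_i (here normal
    with mean mu, sd sigma) and return the empirical mean and variance
    of Xbar.  Theory: E(Xbar) = mu, Var(Xbar) = sigma^2 / n."""
    rng = random.Random(seed)
    means = [sum(rng.gauss(mu, sigma) for _ in range(n)) / n
             for _ in range(trials)]
    m = sum(means) / trials
    v = sum((x - m) ** 2 for x in means) / trials
    return m, v

# With these parameters, theory gives E(Xbar) = 5 and Var(Xbar) = 4/25.
m, v = sample_mean_stats()
```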
Let X1, …, Xn be independent random variables with Xi ∼ N(μi, σ²). Let Q = (X1 − μ1)² + … + (Xn − μn)². Find E(Q) as a function of σ and n.
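Since each (Xi − μi)² has mean σ², linearity gives E(Q) = nσ². A Monte Carlo sketch checks this; the particular μi, σ = 1.5, and n = 10 below are arbitrary placeholder values of mine:

```python
import random

def mean_Q(n=10, sigma=1.5, trials=20000, seed=2):
    """Monte Carlo estimate of E(Q), where Q = (X1-mu1)^2 + ... + (Xn-mun)^2
    and Xi ~ N(mui, sigma^2).  Theory: E(Q) = n * sigma^2, since each
    squared deviation has mean sigma^2 regardless of mui."""
    rng = random.Random(seed)
    mus = [float(i) for i in range(n)]  # arbitrary means; they cancel out
    total = 0.0
    for _ in range(trials):
        total += sum((rng.gauss(mu, sigma) - mu) ** 2 for mu in mus)
    return total / trials

q = mean_Q()  # theory: n * sigma^2 = 10 * 2.25 = 22.5
```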
Let X1, X2, … be independent continuous random variables with a common distribution function F and density f. For k > 1, let
Nk = min{n ≥ k : Xn is the kth largest of X1, …, Xn}.
(a) Show that Pr(Nk = n) = (k − 1)/(n(n − 1)) for n > k.
(b) Argue that the density of X_Nk is f_XNk(x) = f(x)(1 − F(x))^(k−1) Σi≥0 (i+k−2 choose i)(F(x))^i.
(c) Prove the following identity: a^−(k−1) = Σi≥0 (i+k−2 choose i)(1 − a)^i, for a ∈ (0, 1) and k ≥ 2.
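The formula in part (a) can be checked by simulation, using the fact that Xn is the kth largest of X1, …, Xn exactly when k − 1 of the earlier values exceed it. The choices k = 3, the trial count, and the horizon (which truncates the roughly 1% of runs with N3 > 200) are my own:

```python
import random

def nk_distribution(k=3, trials=20000, horizon=200, seed=3):
    """Simulate N_k = min{n >= k : X_n is the kth largest of X_1..X_n}
    for i.i.d. continuous X_i and tabulate its distribution.
    Theory (part a): P(N_k = n) = (k-1)/(n(n-1)) for n > k; also
    P(N_k = k) = 1/k, since X_k must then be the minimum of k values."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(trials):
        xs = []
        for n in range(1, horizon + 1):
            xs.append(rng.random())
            # X_n is the kth largest iff exactly k-1 earlier values exceed it.
            if n >= k and sum(x > xs[-1] for x in xs[:-1]) == k - 1:
                counts[n] = counts.get(n, 0) + 1
                break
    return {n: c / trials for n, c in counts.items()}

# For k = 3: P(N_3 = 3) = 1/3, P(N_3 = 4) = 1/6, P(N_3 = 5) = 1/10.
dist = nk_distribution()
```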
Let X1, X2, … be independent random variables, Xn ∼ U(−1/n, 1/n). Let X be a random variable with P(X = 0) = 1. (a) What is the CDF of Xn? (b) Does Xn converge to X in distribution? In probability?
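For part (b), note that once 1/n ≤ ε the event |Xn| > ε is impossible, so P(|Xn| > ε) → 0 for every ε > 0. A tiny simulation sketch illustrates this; ε = 0.05 and the trial count are arbitrary choices of mine:

```python
import random

def tail_prob(n, eps=0.05, trials=10000, seed=4):
    """Estimate P(|X_n| > eps) for X_n ~ Uniform(-1/n, 1/n).
    Once 1/n <= eps this probability is exactly 0, so X_n -> 0 in
    probability (hence also in distribution to X with P(X = 0) = 1)."""
    rng = random.Random(seed)
    hits = sum(abs(rng.uniform(-1.0 / n, 1.0 / n)) > eps
               for _ in range(trials))
    return hits / trials

# n = 2: X_2 ~ U(-1/2, 1/2), so P(|X_2| > 0.05) = 0.9.
# n = 100: X_100 ~ U(-0.01, 0.01), so the tail probability is exactly 0.
p2, p100 = tail_prob(2), tail_prob(100)
```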
Let X1, X2, … be a sequence of independent and identically distributed continuous random variables. Say that a peak occurs at time n if Xn−1 < Xn > Xn+1, i.e., Xn exceeds both of its neighbors. Argue that the proportion of time at which a peak occurs is, with probability 1, equal to 1/3.
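A simulation sketch supports the 1/3 claim: only the relative order of X(n−1), Xn, X(n+1) matters, all 3! orderings are equally likely, and a peak occurs in 2 of the 6. The uniform draws and the sequence length below are my own choices, since any continuous i.i.d. distribution gives the same answer:

```python
import random

def peak_fraction(n=100000, seed=5):
    """Fraction of times t = 2..n-1 at which a peak occurs in an i.i.d.
    continuous sequence, where a peak at time t means x[t-1] < x[t] > x[t+1].
    Theory: this fraction -> 1/3 with probability 1."""
    rng = random.Random(seed)
    xs = [rng.random() for _ in range(n)]
    peaks = sum(xs[t - 1] < xs[t] > xs[t + 1] for t in range(1, n - 1))
    return peaks / (n - 2)

frac = peak_fraction()  # close to 1/3 for large n
```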
1. Let X, X1, X2, … be random variables defined on the same probability space. Assume that Xn → X. Assume further that there is a random variable Y with E[Y] < ∞ such that P(|Xn| ≤ Y) = 1 for each n. Show that E[Xn] → E[X] as n → ∞.
Let X1, …, Xn be a random sample from a population with pdf f(x) = 1/θ for 0 < x < θ, zero elsewhere. Let Y1 < … < Yn be the order statistics. Show that Y1/Yn and Yn are independent random variables.
Let X1 and X2 be random variables, not necessarily independent. Show that E[X1 + X2] = E[X1] + E[X2]. You may assume that X1 and X2 are discrete with a joint probability mass function for this problem; the above equality also holds for continuous random variables.
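Linearity of expectation can be verified exactly on a small example. The joint pmf below is a hypothetical one I chose to make X1 and X2 clearly dependent:

```python
# A hypothetical joint pmf for (X1, X2) on {0, 1} x {0, 1}; the values are
# chosen so X1 and X2 are dependent (P(X1=1, X2=1) = 0.4 != 0.5 * 0.5).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal expectations, computed from the joint pmf.
e_x1 = sum(x1 * p for (x1, x2), p in joint.items())
e_x2 = sum(x2 * p for (x1, x2), p in joint.items())

# Expectation of the sum, computed directly from the joint pmf.
e_sum = sum((x1 + x2) * p for (x1, x2), p in joint.items())
# Linearity holds despite the dependence: e_sum equals e_x1 + e_x2.
```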