1. Let X1, . . . , Xn be discrete random variables. Assume that the joint probability mass function factorizes as P(X1 = x1, . . . , Xn = xn) = P(X1 = x1) · · · P(Xn = xn) for every (x1, . . . , xn). Show that X1, . . . , Xn are independent.
2. Let X1, . . . , Xn be continuous random variables. Assume that the joint density factorizes as f(x1, . . . , xn) = fX1(x1) · · · fXn(xn) for every (x1, . . . , xn). Show that X1, . . . , Xn are independent.
3. Let X1 and X2 be random variables, not necessarily independent. Show that E[X1 + X2] = E[X1] + E[X2]. You may assume that X1 and X2 are discrete with a joint probability mass function for this problem; the identity also holds for continuous random variables.
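Since the claim is just linearity of expectation computed through a joint pmf, a tiny numeric check can make the mechanics concrete. The joint pmf below is hypothetical and deliberately chosen so that X1 and X2 are dependent, yet the two expectations still agree:

```python
# Hypothetical joint pmf for (X1, X2) on {0, 1}^2. Note the dependence:
# P(X1 = 1) = P(X2 = 1) = 0.5 but P(X1 = 1, X2 = 1) = 0.4 != 0.25.
joint_pmf = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# E[X1 + X2] computed directly from the joint pmf ...
e_sum = sum(p * (x1 + x2) for (x1, x2), p in joint_pmf.items())
# ... versus E[X1] + E[X2] computed from the marginals.
e_x1 = sum(p * x1 for (x1, x2), p in joint_pmf.items())
e_x2 = sum(p * x2 for (x1, x2), p in joint_pmf.items())

print(e_sum, e_x1 + e_x2)  # both equal 1.0
```

The check succeeds despite the dependence, which is exactly the point of the exercise.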
4. Let X1, X2, . . . be independent random variables satisfying E(Xn) = μ and E(Xn^4) ≤ B for some finite B > 0. (a) Show that Yn = Xn − E(Xn) are independent and satisfy E(Yn) = 0, E(Yn^2) ≤ 4√B, and E(Yn^4) ≤ 16B. (b) Show that for Ȳn = (Y1 + · · · + Yn)/n,
E(Ȳn^4) = (1/n^4) Σ_{i=1}^n E(Yi^4) + (6/n^4) Σ_{i<j} E(Yi^2) E(Yj^2) ≤ 16B/n^3 + 48B/n^2.
(c) Show that Σ_{n=1}^∞ P(|Ȳn| > ε) < ∞ for every ε > 0 and conclude that Ȳn → 0 almost surely. (d) Conclude that (X1 + · · · + Xn)/n → μ almost surely.
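A quick Monte Carlo sketch can show the empirical behavior that this strong-law argument formalizes. The choice of distribution is hypothetical: i.i.d. Uniform(0, 1) draws stand in for the Xn, which have mean 1/2 and bounded fourth moments, so the assumptions are satisfied; the centered running averages should shrink toward 0 along the sequence:

```python
import random

random.seed(3)

# Xn i.i.d. Uniform(0, 1): E(Xn) = 1/2 and E(Xn^4) = 1/5, so B = 1 works.
n_max = 200_000
s = 0.0          # running sum of Yn = Xn - E(Xn)
path = []        # |Ybar_n| recorded at a few checkpoints
for n in range(1, n_max + 1):
    s += random.random() - 0.5
    if n % 50_000 == 0:
        path.append(abs(s / n))

print(path)  # the centered averages are small and stay small
```

This is only an illustration of the almost-sure convergence in parts (c) and (d), not a proof.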
5. Let X1, X2, . . . be independent random variables, Xn ∼ U(−1/n, 1/n). Let X be a random variable with P(X = 0) = 1. (a) What is the CDF of Xn? (b) Does Xn converge to X in distribution? In probability?
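One way to probe parts (a) and (b) numerically is to code up the two CDFs and evaluate both at a few fixed points for large n. This is a sanity check, not a proof; the formula inside cdf_xn is the standard Uniform(−1/n, 1/n) CDF:

```python
def cdf_xn(x, n):
    """CDF of Xn ~ Uniform(-1/n, 1/n)."""
    if x < -1.0 / n:
        return 0.0
    if x > 1.0 / n:
        return 1.0
    return (n * x + 1.0) / 2.0

def cdf_x(x):
    """CDF of the point mass at 0: F(x) = 1 for x >= 0, else 0."""
    return 1.0 if x >= 0 else 0.0

# At each fixed x != 0 (the continuity points of F_X), cdf_xn(x, n)
# matches cdf_x(x) once n is large; at x = 0 it sits at 1/2 forever,
# but x = 0 is not a continuity point of F_X.
for x in (-0.5, -0.01, 0.01, 0.5):
    print(x, cdf_xn(x, 10_000), cdf_x(x))
```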
6. Let X1, . . . , Xn be independent random variables, each following an exponential distribution with rate λ. Let Y = min(X1, . . . , Xn). Find the c.d.f. and p.d.f. of Y. HINT:
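A simulation can sanity-check whatever closed form one derives. The sketch below compares the empirical distribution of Y against the conjecture that the minimum is again exponential with rate nλ; the conjecture is treated as a hypothesis to test, and the values λ = 2, n = 5 are chosen arbitrarily:

```python
import math
import random

random.seed(0)

lam, n, trials = 2.0, 5, 200_000
# Empirical samples of Y = min(X1, ..., Xn), Xi ~ Exponential(rate = lam).
ys = [min(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]

# Conjecture to check: Y ~ Exponential(rate = n * lam),
# i.e. F_Y(y) = 1 - exp(-n * lam * y).
y0 = 0.05
empirical = sum(y <= y0 for y in ys) / trials
conjectured = 1.0 - math.exp(-n * lam * y0)
print(empirical, conjectured)  # should be close
```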
7. Let X1, . . . , Xn be independent Beta(θ, 1) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
8. (a) (5 points) Let X1, X2, . . . be a sequence of independent identically distributed random variables uniform on the interval (0, 1], and let Yn = . . . . Compute the (almost sure) limit of Yn. (b) (5 points) Let X1, X2, . . . be independent random variables such that Xn is a discrete random variable uniform on the set {1, 2, . . . , n + 1}. Let Yn = min{X1, X2, . . . , Xn} be the smallest value among X1, . . . , Xn. Show . . .
9. Let X1, . . . , Xn be independent random variables taking values 0 or 1 with P(Xi = 1) = e^(θ−ai)/(1 + e^(θ−ai)), i = 1, . . . , n, for some given constants ai. Find a one-dimensional sufficient statistic for θ.
10. Let X1, . . . , Xn be independent Poisson(θ) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
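For this conjugate pair a closed form is standard: with a shape-rate Gamma(α, β) prior and squared-error loss, the posterior is Gamma(α + Σxi, β + n), so the Bayes estimator (the posterior mean) is (α + Σxi)/(β + n). The sketch below, with all parameter values hypothetical, cross-checks that formula against a self-normalized importance-sampling estimate of the posterior mean:

```python
import math
import random

random.seed(1)

def poisson(mu):
    """Knuth's method for a Poisson(mu) draw (fine for small mu)."""
    L, k, p = math.exp(-mu), 0, 1.0
    while True:
        k += 1
        p *= random.random()
        if p <= L:
            return k - 1

# Hypothetical setup: shape-rate Gamma(alpha, beta) prior, squared-error loss.
alpha, beta, n, theta_true = 2.0, 1.0, 20, 3.0
data = [poisson(theta_true) for _ in range(n)]
s = sum(data)

# Closed form from Gamma-Poisson conjugacy: posterior Gamma(alpha + s, beta + n).
closed_form = (alpha + s) / (beta + n)

# Monte Carlo cross-check: importance sampling with the prior as proposal.
# gammavariate takes (shape, scale), so scale = 1 / beta for a rate-beta prior.
draws = [random.gammavariate(alpha, 1.0 / beta) for _ in range(100_000)]
logw = [s * math.log(t) - n * t for t in draws]  # log-likelihood up to a constant
m = max(logw)
w = [math.exp(lw - m) for lw in logw]
mc = sum(t * wi for t, wi in zip(draws, w)) / sum(w)

print(closed_form, mc)  # the two estimates should be close
```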
11. Let X1, . . . , Xn be independent random variables with Xi ∼ N(μi, σ²). Let Q = (X1 − μ1)² + · · · + (Xn − μn)². Find E(Q) as a function of σ and n.
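Since each squared deviation has mean σ², linearity of expectation suggests E(Q) = nσ², and a short Monte Carlo run makes that plausible before doing the algebra. The means and σ below are chosen arbitrarily:

```python
import random

random.seed(2)

# Hypothetical parameters: distinct means mu_i, common variance sigma^2.
mus = [1.0, -2.0, 0.5, 3.0]          # mu_1, ..., mu_n
sigma, n, trials = 1.5, 4, 100_000

total = 0.0
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for mu in mus]
    total += sum((x - mu) ** 2 for x, mu in zip(xs, mus))

print(total / trials, n * sigma ** 2)  # empirical E(Q) vs n * sigma^2
```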