1. Let X1, ..., Xn, Y1, ..., Yn be mutually independent random variables, and Z =...
4. Let X1, X2, ... be independent random variables satisfying E(Xn) = μ and E(Xn^4) ≤ B for some finite B > 0.
(a) Show that the variables Yn = Xn − E(Xn) are independent with E(Yn) = 0 and E(Yn^4) ≤ 16B for all n.
(b) Show that for Ȳn = (Y1 + · · · + Yn)/n,
E(Ȳn^4) = (1/n^4) Σ_{i=1}^n E(Yi^4) + (6/n^4) Σ_{i<j} E(Yi^2)E(Yj^2) ≤ 16B/n^3 + 48B/n^2.
(c) Show that P(|Ȳn| > ε) ≤ E(Ȳn^4)/ε^4, deduce that Σ_{n=1}^∞ P(|Ȳn| > ε) < ∞ for every ε > 0, and conclude Ȳn → 0 almost surely.
(d) Show that (X1 + ...
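The order of decay claimed in (b) can be illustrated numerically. A minimal sketch, assuming for concreteness Yi ~ Uniform(−1, 1) (mean zero, fourth moment at most 1): the Monte Carlo estimate of E(Ȳn^4) should shrink roughly like 1/n^2 as n grows.

```python
import random

random.seed(0)

def mean_fourth_moment(n, trials=2000):
    """Estimate E[Ybar_n^4] for Y_i ~ Uniform(-1, 1), so E(Y_i) = 0 and E(Y_i^4) <= 1."""
    total = 0.0
    for _ in range(trials):
        ybar = sum(random.uniform(-1.0, 1.0) for _ in range(n)) / n
        total += ybar ** 4
    return total / trials

m10 = mean_fourth_moment(10)
m100 = mean_fourth_moment(100)

# The bound in (b) predicts O(1/n^2) decay, so m100 should be roughly
# two orders of magnitude smaller than m10.
print(m10, m100)
```

Here 2000 trials suffice because Ȳn^4 has small variance; the exact constants are illustrative, not part of the exercise.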
Let Y1, Y2, ..., Yn be independent and identically distributed random variables such that for 0 < p < 1, P(Yi = 1) = p and P(Yi = 0) = q = 1 − p. (Such random variables are called Bernoulli random variables.)
(a) Find the moment-generating function for the Bernoulli random variable Yi.
(b) Find the moment-generating function for W = Y1 + Y2 + · · · + Yn.
(c) What is the distribution of W?
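The answer to (c) can be sanity-checked numerically: the MGF of W obtained as the n-th power of the Bernoulli MGF should match the MGF of a Binomial(n, p) computed directly from its pmf. A small sketch with arbitrary illustrative values of p, n, and t:

```python
import math

# Illustrative parameter values (any 0 < p < 1, n >= 1, real t work)
p, q, n, t = 0.3, 0.7, 8, 0.5

# MGF of a single Bernoulli: m_Y(t) = q + p e^t
m_Y = q + p * math.exp(t)

# MGF of W = Y1 + ... + Yn: independence gives the product of the MGFs
m_W = m_Y ** n

# Direct MGF of Binomial(n, p): E[e^{tW}] = sum_k C(n,k) p^k q^{n-k} e^{tk}
m_binom = sum(math.comb(n, k) * p**k * q**(n - k) * math.exp(t * k)
              for k in range(n + 1))

print(m_W, m_binom)  # the two values should agree
```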
1. Let X1, X2, ..., Xn be independent Normal(μ, σ2) random variables. Let Yn = (1/n) Σ_{i=1}^n Xi denote a sequence of random variables.
(a) Find E(Yn) and Var(Yn) for all n in terms of μ and σ2.
(b) Find the PDF of Yn for all n.
(c) Find the MGF of Yn for all n.
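The MGF in (c) can be checked numerically: since the Xi are independent, the MGF of the sample mean is the n-th power of the Normal MGF evaluated at t/n, and this should agree with the closed form exp(μt + σ2·t^2/(2n)), i.e. the MGF of N(μ, σ2/n). A sketch with arbitrary illustrative parameter values:

```python
import math

mu, sigma, n, t = 1.5, 2.0, 7, 0.4   # illustrative values only

def mgf_normal(m, s, u):
    """MGF of N(m, s^2) evaluated at u: exp(m*u + s^2 * u^2 / 2)."""
    return math.exp(m * u + 0.5 * s * s * u * u)

# Yn = (1/n) sum X_i, so m_{Yn}(t) = prod_i m_{X_i}(t/n) = [m_X(t/n)]^n
m_from_product = mgf_normal(mu, sigma, t / n) ** n

# Claimed closed form: Yn ~ Normal(mu, sigma^2 / n)
m_closed_form = math.exp(mu * t + sigma**2 * t**2 / (2 * n))

print(m_from_product, m_closed_form)  # should coincide up to rounding
```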
Let Y1, Y2, . . . , Yn be independent random variables with Exponential distribution with mean β. Let Y(n) = max(Y1,Y2,...,Yn) and Y(1) = min(Y1,Y2,...,Yn). Find the probability P(Y(1) > y1,Y(n) < yn).
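Since the event {Y(1) > y1, Y(n) < yn} says exactly that every Yi falls in (y1, yn), the answer should be [e^{−y1/β} − e^{−yn/β}]^n for y1 < yn. A Monte Carlo sanity check of that closed form, with illustrative values (note `expovariate` takes the rate 1/β, since β is the mean):

```python
import math
import random

random.seed(1)

beta, n = 2.0, 5           # mean-beta exponentials, sample size (illustrative)
y1, yn = 0.5, 4.0          # thresholds with y1 < yn

# Closed form: P(Y(1) > y1, Y(n) < yn) = P(every Y_i in (y1, yn))
#            = (e^{-y1/beta} - e^{-yn/beta})^n
p_theory = (math.exp(-y1 / beta) - math.exp(-yn / beta)) ** n

trials, hits = 100_000, 0
for _ in range(trials):
    ys = [random.expovariate(1.0 / beta) for _ in range(n)]
    if min(ys) > y1 and max(ys) < yn:
        hits += 1
p_mc = hits / trials

print(p_theory, p_mc)  # the empirical frequency should be close to the formula
```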
Let X1, X2, ..., Xn be independent Exp(2) distributed random variables, and set Y1 = X(1) and Yk = X(k) − X(k−1) for 2 ≤ k ≤ n. Find the joint pdf of Y1, Y2, ..., Yn. Hint: Note that (Y1, Y2, ..., Yn) = g(X(1), X(2), ..., X(n)), where g is invertible and differentiable. Use the change-of-variable formula to derive the joint pdf of Y1, Y2, ..., Yn.
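The change of variables leads to the Rényi representation: the spacings Yk come out independent, with Yk exponential with rate 2(n − k + 1) (reading Exp(2) as rate 2; if 2 is instead the mean, rescale accordingly). A Monte Carlo check of the implied means E(Yk) = 1/(2(n − k + 1)):

```python
import random

random.seed(2)

rate, n, trials = 2.0, 4, 50_000

sums = [0.0] * n
for _ in range(trials):
    xs = sorted(random.expovariate(rate) for _ in range(n))
    # spacings: Y1 = X(1), Yk = X(k) - X(k-1) for k >= 2
    spacings = [xs[0]] + [xs[k] - xs[k - 1] for k in range(1, n)]
    for k in range(n):
        sums[k] += spacings[k]
mc_means = [s / trials for s in sums]

# Claimed result: Y_k ~ Exp(rate * (n - k + 1)) for k = 1, ..., n,
# so E[Y_k] = 1 / (rate * (n - k + 1)); the index k below is 0-based.
theory_means = [1.0 / (rate * (n - k)) for k in range(n)]

print(mc_means)
print(theory_means)
```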
Suppose that X1, X2, ..., Xn and Y1, Y2, ..., Yn are independent random samples from populations with the same mean μ and variances σ1^2 and σ2^2, respectively. That is, Xi ~ N(μ, σ1^2) and Yi ~ N(μ, σ2^2). Show that (2X̄ + 3Ȳ)/5, where X̄ and Ȳ denote the sample means, is a consistent estimator of μ.
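A simulation sketch of consistency, assuming the estimator is (2X̄ + 3Ȳ)/5 (the weights must sum to one for the limit to be μ) and using illustrative values for μ, σ1, σ2: the spread of the estimator around μ should shrink as n grows, since its variance is (4σ1^2 + 9σ2^2)/(25n).

```python
import random
import statistics

random.seed(3)

mu, s1, s2 = 5.0, 1.0, 2.0     # illustrative values for mu, sigma_1, sigma_2

def estimate(n):
    """One draw of the estimator (2*Xbar + 3*Ybar)/5 from samples of size n."""
    xbar = statistics.fmean(random.gauss(mu, s1) for _ in range(n))
    ybar = statistics.fmean(random.gauss(mu, s2) for _ in range(n))
    return (2 * xbar + 3 * ybar) / 5

# Consistency: the estimator's spread around mu shrinks as n grows.
small_n = [estimate(20) for _ in range(500)]
large_n = [estimate(2000) for _ in range(500)]

print(statistics.pstdev(small_n), statistics.pstdev(large_n))
```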
74. Let X1, X2, ... be a sequence of independent identically distributed continuous random variables. We say that a record occurs at time n if Xn > max(X1, ..., Xn−1). That is, Xn is a record if it is larger than each of X1, ..., Xn−1. Show
(i) P{a record occurs at time n} = 1/n;
(ii) E[number of records by time n] = Σ_{i=1}^n 1/i;
(iii) Var(number of records by time n) = Σ_{i=1}^n (i − 1)/i^2;
(iv) Let N = ...
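Parts (i) and (ii) are easy to check by simulation: a record occurs at time n exactly when Xn is the maximum of the first n observations, and the expected record count by time n is the harmonic sum. A sketch with n = 10 and uniform draws (any continuous distribution works, since only ranks matter):

```python
import random

random.seed(4)

n, trials = 10, 40_000
record_at_n = 0
total_records = 0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    best = xs[0]
    records = 1            # X_1 is always a record
    for x in xs[1:]:
        if x > best:
            best = x
            records += 1
    if xs[-1] == max(xs):  # record at time n iff X_n is the running maximum
        record_at_n += 1
    total_records += records

p_hat = record_at_n / trials      # (i): should be close to 1/n
h_hat = total_records / trials    # (ii): should be close to sum_{i=1}^n 1/i

harmonic = sum(1.0 / i for i in range(1, n + 1))
print(p_hat, 1 / n)
print(h_hat, harmonic)
```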
Let Y1, Y2, ..., Yn be independent random variables each having uniform distribution on the interval (0, θ).
(a) Find the distribution of Y(n) and find its expected value.
(b) Find the joint density function of Y(i) and Y(j) where 1 ≤ i < j ≤ n. Hence find Cov(Y(i), Y(j)).
(c) Find var(Y(j) − Y(i)).
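Parts (a) and (c) can be sanity-checked by simulation. With θ = 1, E(Y(n)) = n/(n + 1), and since Y(j) − Y(i) has the same distribution as θ·Beta(j − i, n − (j − i) + 1), one gets Var(Y(j) − Y(i)) = (j − i)(n − j + i + 1)θ^2/((n + 1)^2 (n + 2)). A sketch with illustrative n, i, j:

```python
import random
import statistics

random.seed(5)

theta, n, trials = 1.0, 5, 50_000
i, j = 2, 4                      # order-statistic indices, 1-based, i < j

maxes, diffs = [], []
for _ in range(trials):
    ys = sorted(random.uniform(0.0, theta) for _ in range(n))
    maxes.append(ys[-1])                 # Y(n)
    diffs.append(ys[j - 1] - ys[i - 1])  # Y(j) - Y(i)

# (a) E[Y(n)] = n * theta / (n + 1); here 5/6
e_max = statistics.fmean(maxes)

# (c) Var(Y(j) - Y(i)) = (j-i)(n-j+i+1) theta^2 / ((n+1)^2 (n+2))
v_diff = statistics.pvariance(diffs)
v_theory = (j - i) * (n - j + i + 1) * theta**2 / ((n + 1) ** 2 * (n + 2))

print(e_max, n * theta / (n + 1))
print(v_diff, v_theory)
```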