Let X1, ..., Xn be n independent Bernoulli random variables. Let Y1, ..., Yn be another n independent Bernoulli random variables. Let X = X1 + ··· + Xn and Y = Y1 + ··· + Yn. Suppose that P(Xi = 1) ≥ P(Yi = 1) for all i = 1, 2, ..., n. Does this guarantee that P(X > k) ≥ P(Y > k) for all k = 1, 2, ..., n?
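A quick numerical probe of the claim (a sketch of my own, not part of the question: the parameter vectors px and py below are arbitrary examples satisfying the elementwise condition) computes the exact PMF of each sum by convolving one Bernoulli factor at a time and compares the tails:

```python
import numpy as np

def sum_pmf(ps):
    """Exact PMF of a sum of independent Bernoulli(p_i) variables,
    built by convolving one Bernoulli PMF [1-p, p] at a time."""
    pmf = np.array([1.0])
    for p in ps:
        pmf = np.convolve(pmf, [1 - p, p])
    return pmf

def tail(pmf, k):
    """P(S > k) for a sum S with the given PMF."""
    return pmf[k + 1:].sum()

# Example parameters with P(X_i = 1) >= P(Y_i = 1) for every i.
px = [0.9, 0.5, 0.7]
py = [0.4, 0.5, 0.2]
fx, fy = sum_pmf(px), sum_pmf(py)
for k in range(3):
    assert tail(fx, k) >= tail(fy, k)   # dominance holds in this example
```

A check like this can only support the conjecture for particular parameters; it cannot replace a proof (or a counterexample search) for the general question.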
1. Let X1, ..., Xn, Y1, ..., Yn be mutually independent random variables, and Z = Σ_{i=1}^n Xi·Yi. Suppose that for each i ∈ {1, ..., n}, Xi ~ Bernoulli(p) and Yi ~ Binomial(n, p). What is Var[Z]?
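One way to sanity-check an answer here (my own derivation, not given in the thread): since Xi and Yi are independent, Var(Xi·Yi) = E[Xi²]E[Yi²] − (E[Xi]E[Yi])², which works out to n²p²(1−p)(1+np)/n per term and Var[Z] = n²p²(1−p)(1+np) overall. The sketch below compares that closed form against simulation:

```python
import numpy as np

# Hypothetical check (function names are my own): compare the closed form
# Var[Z] = n^2 p^2 (1-p)(1 + n p), derived for independent X_i ~ Bernoulli(p)
# and Y_i ~ Binomial(n, p), against a Monte Carlo estimate.
def var_z_formula(n, p):
    return n**2 * p**2 * (1 - p) * (1 + n * p)

def var_z_monte_carlo(n, p, reps=200_000, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.binomial(1, p, size=(reps, n))   # Bernoulli(p) draws
    y = rng.binomial(n, p, size=(reps, n))   # Binomial(n, p) draws
    z = (x * y).sum(axis=1)
    return z.var()

n, p = 4, 0.3
exact = var_z_formula(n, p)
approx = var_z_monte_carlo(n, p)
assert abs(approx - exact) / exact < 0.1    # agree to within sampling noise
```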
I don't understand (a)(iii) and (b)(ii). What is the procedure for deriving the limit distribution? Thanks. 6. Extreme values are of central importance in risk management, and the following two questions provide the fundamental tool used in extreme value theory. (a) Let X1, ..., Xn be independent identically distributed (i.i.d.) Exp(1) random variables and define Zn = max(X1, ..., Xn). (i) Find the cumulative distribution of Zn. (ii) Calculate the cumulative distribution of Vn = Zn − ln n. (iii) ...
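On the limit-distribution procedure being asked about: from (i), F_{Zn}(x) = (1 − e^{−x})^n for x > 0, so from (ii) the cdf of Vn = Zn − ln n is (1 − e^{−x}/n)^n, and letting n → ∞ gives the Gumbel cdf exp(−e^{−x}). A small numerical check of that convergence (my own sketch; the truncated part (iii) may ask for exactly this limit, which is an assumption on my part):

```python
import math

def cdf_vn(x, n):
    # cdf of V_n = Z_n - ln(n), valid for x > -ln(n)
    return (1.0 - math.exp(-x) / n) ** n

def gumbel_cdf(x):
    # standard Gumbel cdf, the candidate limit
    return math.exp(-math.exp(-x))

# For large n the two should be close pointwise.
for x in (-1.0, 0.0, 1.0, 2.0):
    assert abs(cdf_vn(x, 10_000) - gumbel_cdf(x)) < 1e-3
```

The key algebraic step is the classical limit (1 − c/n)^n → e^{−c} applied with c = e^{−x}.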
Q2. Suppose X1, X2, X3 are independent Bernoulli random variables with p = 0.5. Let Yi be the partial sums, i.e., Y1 = X1, Y2 = X1 + X2, Y3 = X1 + X2 + X3. 1. What is the distribution of each Yi, i = 1, 2, 3? 2. What is the expected value of Y1 + Y2 + Y3? 3. Are Y1 and Y2 independent? Explain by computing their joint PMF. 4. What is the variance of Y1...
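Since there are only 8 equally likely outcomes for (X1, X2, X3), everything here can be checked by brute-force enumeration (a sketch of my own; the thread gives no solution):

```python
from itertools import product

outcomes = list(product([0, 1], repeat=3))   # 8 outcomes, each prob 1/8

def pmf_y(idx):
    """PMF of the partial sum Y_idx = X_1 + ... + X_idx."""
    counts = {}
    for w in outcomes:
        s = sum(w[:idx])
        counts[s] = counts.get(s, 0) + 1
    return {k: v / 8 for k, v in counts.items()}

# Each Y_i ~ Binomial(i, 1/2); e.g. Y2 has pmf {0: 1/4, 1: 1/2, 2: 1/4}.
assert pmf_y(2) == {0: 0.25, 1: 0.5, 2: 0.25}

# Joint pmf of (Y1, Y2): P(Y1=1, Y2=0) = 0, while P(Y1=1)P(Y2=0) = 1/8,
# so Y1 and Y2 are NOT independent.
joint_10 = sum(1 for w in outcomes if w[0] == 1 and w[0] + w[1] == 0) / 8
assert joint_10 == 0.0
```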
Please answer all the parts neatly with all details. 4. Let X1, X2, ... be independent random variables satisfying E(Xn^4) ≤ B for some finite B > 0 and E(Xn) → μ. (a) Show that Yn = Xn − E(Xn) are independent and E(Yn) = 0, E(Yn^2) ≤ √B, E(Yn^4) ≤ 16B. (b) Show that for Ȳn = (Y1 + ... + Yn)/n, E(Ȳn^4) = (1/n^4) Σ_{i=1}^n E(Yi^4) + (6/n^4) Σ_{i<j} E(Yi^2) E(Yj^2) ≤ 16B/n^3 + 6B/n^2. (c) Show that Σn P(|Ȳn| > ε) < ∞ and ...
Let X1, ..., Xn be a random sample from a population with pdf f(x) = 1/θ, 0 < x < θ, zero elsewhere. Let Y1 < ... < Yn be the order statistics. Show that Y1/Yn and Yn are independent random variables.
Let Y1, Y2, ..., Yn be independent and identically distributed random variables such that for 0 < p < 1, P(Yi = 1) = p and P(Yi = 0) = q = 1 − p. (Such random variables are called Bernoulli random variables.) (a) Find the moment-generating function for the Bernoulli random variable Y. (b) Find the moment-generating function for W = Y1 + Y2 + ... + Yn. (c) What is the distribution of W?
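For (a)–(c): the Bernoulli(p) mgf is m_Y(t) = q + p·e^t, so by independence the mgf of W is (q + p·e^t)^n, which is the Binomial(n, p) mgf; hence W ~ Binomial(n, p). A small numeric check of that identity (my own sketch, with hypothetical example values n = 6, p = 0.3):

```python
import math

def mgf_w(t, n, p):
    # (q + p e^t)^n, the product of n identical Bernoulli mgfs
    return (1 - p + p * math.exp(t)) ** n

def binomial_mgf(t, n, p):
    # E[e^{tW}] computed directly from the Binomial(n, p) pmf
    return sum(math.comb(n, k) * (1 - p)**(n - k) * p**k * math.exp(t * k)
               for k in range(n + 1))

for t in (-1.0, 0.5, 2.0):
    assert abs(mgf_w(t, 6, 0.3) - binomial_mgf(t, 6, 0.3)) < 1e-9
```

The two agree exactly by the binomial theorem; the code only confirms the algebra at a few points.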
Happiness and Being in a Relationship. Let X, Y be two Bernoulli random variables and let p = P(X = 1) (the probability that X = 1), q = P(Y = 1) (the probability that Y = 1), and r = P(X = 1, Y = 1) (the probability that both X = 1 and Y = 1). Let (X1, Y1), ..., (Xn, Yn) be a sample of n i.i.d. copies of (X, Y). Define r̂ = (1/n) Σ_{i=1}^n Xi·Yi, and do not assume r = pq. We ...
Let X1, X2, ..., Xn be independent Exp(2) distributed random variables, and set Y1 = X(1) and Yk = X(k) − X(k−1), 2 ≤ k ≤ n. Find the joint pdf of Y1, Y2, ..., Yn. Hint: Note that (Y1, Y2, ..., Yn) = g(X(1), X(2), ..., X(n)), where g is invertible and differentiable. Use the change of variables formula to derive the joint pdf of Y1, Y2, ..., Yn.
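The change of variables should reveal that the spacings are independent with Yk ~ Exp((n − k + 1)·λ). A simulation sanity check of the implied means (my own sketch; it reads Exp(2) as rate λ = 2, which is an assumption since the notation is ambiguous):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps, rate = 4, 200_000, 2.0

# Draw, sort each row to get order statistics, then take spacings
# (prepending 0 so the first spacing is X_(1) itself).
x = np.sort(rng.exponential(1 / rate, size=(reps, n)), axis=1)
spacings = np.diff(np.concatenate([np.zeros((reps, 1)), x], axis=1), axis=1)

# If Y_k ~ Exp((n - k + 1) * rate), then E[Y_k] = 1 / (rate * (n - k + 1)).
for k in range(1, n + 1):
    expected = 1.0 / (rate * (n - k + 1))
    assert abs(spacings[:, k - 1].mean() - expected) < 0.01
```

Matching means is of course weaker than the full joint-pdf result the problem asks for, but it is a useful check on the answer's form.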
Please explain both questions. Show work. 7. Suppose X and Y are random variables with joint PMF given by:

X \ Y | y1    y2
  1   | 0.16  0.24
  2   | 0.20  0.30
  3   | 0.04  0.06

Are X and Y independent? (a) Yes (b) No. Answer: (a). 8. Let X and Y be two random variables such that ... , where Z is independent of X and Z ~ N(0, σ²). A random sample of n = 25 pairs (X1, Y1), ..., (Xn, Yn) resulted in the following statistics: X̄ = −0.14; Ȳ = 5.04; Σ(Xi − X̄)² = 220.11; Σ(Xi − X̄)Yi = 100.23 ...
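For question 7, independence can be checked by testing whether the joint PMF factorizes into the product of its marginals (a sketch; the row/column labels are as I read the garbled table, with rows = values of X):

```python
import numpy as np

joint = np.array([[0.16, 0.24],
                  [0.20, 0.30],
                  [0.04, 0.06]])
px = joint.sum(axis=1)    # marginal of X: [0.4, 0.5, 0.1]
py = joint.sum(axis=0)    # marginal of Y: [0.4, 0.6]

# Every cell equals the product of its marginals, so X and Y are independent.
assert np.allclose(joint, np.outer(px, py))
```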