14. Let X1, X2, X3 be independent random variables that represent lifetimes (in hours) of three...
Let X1, X2, X3 be independent random variables with E(X1) = 1, E(X2) = 2, and E(X3) = 3. Let Y = 3X1 − 2X2 + X3. Find E(Y) and Var(Y) in the following cases. (a) X1, X2, X3 are Poisson. [Recall that the variance of Poisson(λ) is λ.] (b) X1, X2, X3 are normal, with respective variances σ1² = 1, σ2² = 3, σ3² = 5. Find P(0 ≤ Y ≤ 5). [Recall that any linear combination of independent normal...
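A Monte Carlo sketch of part (a): by linearity E(Y) = 3·1 − 2·2 + 1·3 = 2, and since the coefficients are squared in the variance, Var(Y) = 9·1 + 4·2 + 1·3 = 20. The sampler below (Knuth's product-of-uniforms method, used here only so the block is self-contained) checks both numbers.

```python
import math
import random

# Monte Carlo sanity check for part (a): X1, X2, X3 ~ Poisson with
# means 1, 2, 3, and Y = 3*X1 - 2*X2 + X3.
# Theory: E(Y) = 3*1 - 2*2 + 1*3 = 2, and since Var(Poisson(lam)) = lam,
# Var(Y) = 9*1 + 4*2 + 1*3 = 20 (coefficients are squared in the variance).

def poisson(lam, rng):
    # Knuth's algorithm: count uniforms until their product drops
    # below exp(-lam).
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(0)
n = 200_000
ys = [3 * poisson(1, rng) - 2 * poisson(2, rng) + poisson(3, rng)
      for _ in range(n)]
mean = sum(ys) / n                        # should be close to 2
var = sum((y - mean) ** 2 for y in ys) / n  # should be close to 20
```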
3. (25 pts.) Let X1, X2, X3 be independent random variables such that Xi ~ Poisson(λ), i = 1, 2, 3. Let N = X1 + X2 + X3. (a) What is the distribution of N? (b) Find the conditional distribution of (X1, X2, X3) | N. (c) Now let N, X1, X2, X3 be random variables such that N ~ Poisson(λ) and (X1, X2, X3) | N ~ Trinomial(N; p1, p2, p3), where p1 + p2 + p3 = 1. Find the unconditional distribution of (X1, X2, X3).
Exercise 6.48. Let X1, X2, ..., Xn be independent exponential random variables, with parameter λi for Xi. Let Y be the minimum of these random variables. Show that Y ~ Exp(λ1 + ... + λn).
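The key step is the survival function: P(Y > t) = ∏ P(Xi > t) = ∏ e^{−λi t} = e^{−(λ1+...+λn)t}, which is exactly the Exp(λ1 + ... + λn) survival function. A quick Monte Carlo check with illustrative rates (the λi below are assumptions, not from the exercise):

```python
import random

# For independent Xi ~ Exp(lambda_i), P(Y > t) = exp(-(sum lambda_i) * t),
# so Y ~ Exp(lambda_1 + ... + lambda_n) and E[Y] = 1 / sum(rates).
# The rates below are illustrative choices.

rates = [0.5, 1.0, 1.5]            # assumed lambda_1, lambda_2, lambda_3
rng = random.Random(2)
n = 200_000
mins = [min(rng.expovariate(r) for r in rates) for _ in range(n)]
mean_min = sum(mins) / n           # theory: 1 / sum(rates) = 1/3
```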
If X1, X2, and X3 are three independent Uniform random variables (Xi ~ Unif(0,1)): a) Use the convolution integral to find the density function of Z = X1 + X2 + X3. b) What is E[Z]?
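Two facts worth checking numerically: E[Z] = 3 · (1/2) = 3/2 by linearity, and the convolution density (the Irwin–Hall density for n = 3) is symmetric about 3/2, so P(Z < 3/2) = 1/2.

```python
import random

# Monte Carlo check: Z = X1 + X2 + X3 with Xi ~ Unif(0,1) iid.
# By linearity E[Z] = 3 * 1/2 = 1.5, and the convolution density is
# symmetric about 1.5, so P(Z < 1.5) = 1/2.

rng = random.Random(3)
n = 200_000
zs = [rng.random() + rng.random() + rng.random() for _ in range(n)]
mean_z = sum(zs) / n                     # expect 1.5
p_below = sum(z < 1.5 for z in zs) / n   # expect 0.5
```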
4.) Let X1, X2 and X3 be independent uniform random variables on [0,1]. Write Y = X1 + X2 and Z = X2 + X3. a.) Compute E[X1 X2 X3]. (5 points) b.) Compute Var(X1). (5 points) c.) Compute and draw a graph of the density function fY. (15 points)
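A numerical sketch of the three answers: E[X1 X2 X3] = (1/2)³ = 1/8 by independence, Var(X1) = 1/12 for Unif(0,1), and fY is the triangular density fY(y) = y on [0,1] and 2 − y on [1,2], with peak fY(1) = 1.

```python
import random

# Sanity checks: E[X1 X2 X3] = 1/8, Var(X1) = 1/12, and Y = X1 + X2 has
# the triangular density fY(y) = y on [0,1], 2 - y on [1,2].

rng = random.Random(4)
n = 200_000
prod_mean = sum(rng.random() * rng.random() * rng.random()
                for _ in range(n)) / n           # expect 1/8
xs = [rng.random() for _ in range(n)]
mx = sum(xs) / n
var_x = sum((x - mx) ** 2 for x in xs) / n       # expect 1/12
# mass near the peak: integrating fY over (0.95, 1.05) gives 0.0975
ys = [rng.random() + rng.random() for _ in range(n)]
p_peak = sum(0.95 < y < 1.05 for y in ys) / n
```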
11. Let X1, X2, X3 and X4 be independent lifetimes of memory chips. Suppose that Xi ~ N(300, 10²) for i = 1, 2, 3, 4, where the parameters are measured in hours. Compute the probability that at least two of the four chips last at least 310 hours. (You may leave your answer in terms of an integral, in terms of Φ, or as an actual real number.)
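A worked numerical sketch: each chip lasts at least 310 hours with probability p = P(N(300, 10²) > 310) = 1 − Φ(1) ≈ 0.1587, and "at least two of four" is a Binomial(4, p) tail, 1 − P(0) − P(1). Φ is expressed below via the error function.

```python
import math

# p = 1 - Phi(1), with Phi the standard normal CDF written via math.erf;
# "at least two of four chips" is then 1 - q^4 - 4*p*q^3 with q = 1 - p.

def phi(z):
    # Standard normal CDF.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

p = 1.0 - phi((310 - 300) / 10)          # = 1 - Phi(1), about 0.1587
q = 1.0 - p
at_least_two = 1.0 - q**4 - 4 * p * q**3  # about 0.121
```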
Q2 Suppose X1, X2, X3 are independent Bernoulli random variables with p = 0.5. Let Yi be the partial sums, i.e., Y1 = X1, Y2 = X1 + X2, Y3 = X1 + X2 + X3. 1. What is the distribution of each Yi, i = 1, 2, 3? 2. What is the expected value of Y1 + Y2 + Y3? 3. Are Y1 and Y2 independent? Explain it by computing their joint p.m.f. 4. What is the variance of Y1...
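Because there are only 8 equally likely outcomes of (X1, X2, X3), parts 2 and 3 can be checked by exact enumeration: each Yi is Binomial(i, 1/2), so E[Y1 + Y2 + Y3] = 1/2 + 1 + 3/2 = 3, and the joint p.m.f. of (Y1, Y2) fails the product test (e.g. P(Y1 = 0, Y2 = 2) = 0 while P(Y1 = 0)·P(Y2 = 2) = 1/8), so Y1 and Y2 are not independent.

```python
from itertools import product

# Exact enumeration of the 8 equally likely (X1, X2, X3) outcomes.
joint = {}      # joint p.m.f. of (Y1, Y2)
total = 0.0     # accumulates E[Y1 + Y2 + Y3]
for x1, x2, x3 in product([0, 1], repeat=3):
    y1, y2, y3 = x1, x1 + x2, x1 + x2 + x3
    joint[(y1, y2)] = joint.get((y1, y2), 0) + 1 / 8
    total += (y1 + y2 + y3) / 8

# marginals of Y1 and Y2
p_y1 = {a: sum(p for (i, _), p in joint.items() if i == a) for a in (0, 1)}
p_y2 = {b: sum(p for (_, j), p in joint.items() if j == b) for b in (0, 1, 2)}

# product test: P(Y1=0, Y2=2) = 0 but P(Y1=0)*P(Y2=2) = 1/2 * 1/4 > 0
independent = all(
    abs(joint.get((a, b), 0) - p_y1[a] * p_y2[b]) < 1e-12
    for a in (0, 1) for b in (0, 1, 2)
)
```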
Let X1, X2 be two independent exponential random variables with λ = 1, and compute P(X1 + X2 < t) using the joint density function. Let Z be a gamma random variable with parameters (2, 1), and compute P(Z < t). What do you find by comparing P(X1 + X2 < t) and P(Z < t)? Then compare P(X1 + X2 + X3 < t), where the Xi are iid (independent and identically distributed) ~ Exp(1), with P(Z < t) for Z ~ Gamma(3, 1). (You don't have to compute.) [Hint: You can use the fact that Γ(2) = 1, Γ(3) = 2.] Problem 2 [10 points] Let...
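An empirical sketch of the identity the problem points at: a sum of k iid Exp(1) variables is Gamma(k, 1), so P(X1 + X2 < t) should match the Gamma(2, 1) CDF, which integrates (using Γ(2) = 1) to 1 − e^{−t} − t·e^{−t}.

```python
import math
import random

# Compare the empirical distribution of X1 + X2 (Xi iid Exp(1)) with the
# closed-form Gamma(2, 1) CDF: P(Z < t) = 1 - e^{-t} - t * e^{-t}.

rng = random.Random(5)
t, n = 1.5, 200_000
emp = sum(rng.expovariate(1) + rng.expovariate(1) < t
          for _ in range(n)) / n
exact = 1 - math.exp(-t) - t * math.exp(-t)   # Gamma(2, 1) CDF at t
```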
Let X1, X2, and X3 be three independent, continuous random variables with the same distribution. Given X2 is smaller than X3, what is the conditional probability that X1 is smaller than X2?
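By symmetry, all 3! orderings of iid continuous variables are equally likely. Among the three orderings consistent with X2 < X3, exactly one also has X1 < X2, so the conditional probability is 1/3. A Monte Carlo check (any continuous distribution works; uniforms are used here):

```python
import random

# P(X1 < X2 | X2 < X3) for iid continuous variables: of the 3 equally
# likely orderings with X2 < X3, only X1 < X2 < X3 has X1 < X2 -> 1/3.

rng = random.Random(6)
hits = cond = 0
for _ in range(300_000):
    x1, x2, x3 = rng.random(), rng.random(), rng.random()
    if x2 < x3:
        cond += 1
        hits += x1 < x2
p_est = hits / cond        # expect 1/3
```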
3. Let {X1, X2, X3, X4} be independent, identically distributed random variables with p.d.f. f(x) = 2x if 0 < x < 1, 0 else. Find E[Y] where Y = min{X1, X2, X3, X4}.
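A Monte Carlo sketch, assuming the density f(x) = 2x on (0, 1): then F(x) = x², so each Xi can be sampled as √U with U ~ Unif(0,1). The minimum satisfies P(Y > y) = (1 − y²)⁴, and integrating the survival function gives E[Y] = ∫₀¹ (1 − y²)⁴ dy = 128/315 ≈ 0.4063.

```python
import math
import random

# F(x) = x^2 on (0,1), so inverse-CDF sampling gives Xi = sqrt(U).
# Theory: E[min of 4] = integral of (1 - y^2)^4 over (0,1) = 128/315.

rng = random.Random(7)
n = 200_000
mins = [min(math.sqrt(rng.random()) for _ in range(4)) for _ in range(n)]
mean_min = sum(mins) / n           # expect 128/315 ~ 0.4063
```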