Let X1 and X2 be independent gamma random variables with X1 ~ Gamma(a1, 1) and X2 ~ Gamma(a2, 1). Find the marginal distributions of X1/(X1 + X2) and X2/(X1 + X2).
Let Y1 = X1/(X1 + X2) and Y2 = X1 + X2
We will use the Jacobian (change-of-variables) method to calculate the joint distribution of X1/(X1 + X2) and X1 + X2.
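Before working the Jacobian out by hand, a quick Monte Carlo sanity check of the expected answer can be useful (a sketch, not part of the problem; a1 = 2, a2 = 3 are arbitrary illustrative values): X1/(X1 + X2) should come out Beta(a1, a2), whose mean is a1/(a1 + a2).

```python
import random

# Monte Carlo sanity check (a sketch; a1, a2 are arbitrary illustrative
# values): X1/(X1 + X2) should be Beta(a1, a2), whose mean is a1/(a1 + a2).
random.seed(0)
a1, a2 = 2.0, 3.0
n = 200_000

x1 = [random.gammavariate(a1, 1.0) for _ in range(n)]
x2 = [random.gammavariate(a2, 1.0) for _ in range(n)]
mean_ratio = sum(u / (u + v) for u, v in zip(x1, x2)) / n

print(abs(mean_ratio - a1 / (a1 + a2)) < 0.01)  # True
```

The simulated mean of the ratio agrees with the Beta(a1, a2) mean, consistent with the Jacobian derivation's conclusion.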
2. Let X1 and X2 be independent Poisson random variables with parameters λ1 and λ2. Show that for every n ≥ 1, the conditional distribution of X1, given X1 + X2 = n, is binomial, and find the parameters of this binomial distribution.
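The claimed answer can be checked numerically for one choice of parameters (a sketch; λ1 = 2, λ2 = 3, n = 5 are arbitrary): the conditional pmf computed directly from the joint Poisson pmf should match Binomial(n, λ1/(λ1 + λ2)).

```python
from math import comb, exp, factorial

# Exact check for one (n, lam1, lam2): the conditional pmf of X1 given
# X1 + X2 = n should equal Binomial(n, lam1/(lam1 + lam2)).
lam1, lam2, n = 2.0, 3.0, 5
p = lam1 / (lam1 + lam2)

def pois(j, l):
    return exp(-l) * l**j / factorial(j)

denom = sum(pois(j, lam1) * pois(n - j, lam2) for j in range(n + 1))
cond = [pois(k, lam1) * pois(n - k, lam2) / denom for k in range(n + 1)]
binom = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

print(all(abs(c - b) < 1e-12 for c, b in zip(cond, binom)))  # True
```

The two pmfs agree term by term, which is exactly what the algebraic proof establishes in general.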
4a). Let X1 and X2 be independent random variables with a common cumulative distribution function (c.d.f.) F(y) = 0 for y ≤ 0, F(y) = y for 0 < y < 1, and F(y) = 1 for y ≥ 1 (i.e., the Uniform(0, 1) distribution). Find the p.d.f. of X(2) = max(X1, X2). Are X(1)/X(2) and X(2) independent, where X(1) = min(X1, X2)?
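A Monte Carlo check of the independence question, assuming the common distribution is Uniform(0, 1) (that reading of the garbled c.d.f. is an assumption here): if X(1)/X(2) and X(2) are independent, joint probabilities should factor into products of marginals.

```python
import random

# Monte Carlo check under the Uniform(0,1) reading of the c.d.f. (an
# assumption): does P(X(1)/X(2) <= a, X(2) <= b) factor into the product
# of the two marginal probabilities? a, b are arbitrary test points.
random.seed(6)
n = 200_000
a, b = 0.5, 0.7
joint = ratio_cnt = max_cnt = 0
for _ in range(n):
    u, v = random.random(), random.random()
    lo, hi = min(u, v), max(u, v)
    joint += (lo / hi <= a) and (hi <= b)
    ratio_cnt += lo / hi <= a
    max_cnt += hi <= b

print(abs(joint / n - (ratio_cnt / n) * (max_cnt / n)) < 0.01)  # True
```

The factorization holds at this test point, consistent with independence; the written solution still needs the joint-density argument for all (a, b).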
3. (25 pts.) Let X1, X2, X3 be independent random variables such that Xi ~ Poisson(λ), i = 1, 2, 3. Let N = X1 + X2 + X3. (a) What is the distribution of N? (b) Find the conditional distribution of (X1, X2, X3) | N. (c) Now let N, X1, X2, X3 be random variables such that N ~ Poisson(λ) and (X1, X2, X3) | N ~ Trinomial(N; p1, p2, p3), where p1 + p2 + p3 = 1. Find the unconditional distribution of (X1, X2, X3).
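For part (c), the marginal of X1 can be checked numerically (a sketch; λ = 4, p1 = 0.2, k = 3 are arbitrary): mixing Binomial(k; n, p1) over a Poisson(λ) number of trials should give Poisson(λ·p1), the "Poisson thinning" identity.

```python
from math import comb, exp, factorial

# Numeric check of part (c)'s marginal (a sketch; lam, p1, k arbitrary):
# summing P(N = n) * Binomial(k; n, p1) over n should give Poisson(lam * p1).
lam, p1, k = 4.0, 0.2, 3

def pois(j, l):
    return exp(-l) * l**j / factorial(j)

# Truncate the sum at n = 99; the Poisson(4) tail beyond that is negligible.
marg = sum(pois(n, lam) * comb(n, k) * p1**k * (1 - p1)**(n - k)
           for n in range(k, 100))

print(abs(marg - pois(k, lam * p1)) < 1e-10)  # True
```

The same calculation applied to each coordinate suggests X1, X2, X3 are independent Poisson(λp1), Poisson(λp2), Poisson(λp3), which is what the full derivation shows.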
(10 marks) Let X1, X2, ... be a sequence of independent and identically distributed random variables with mean E[X1] = μ and Var(X1) = σ². Let Y1, Y2, ... be another sequence of independent and identically distributed random variables with mean E[Y1] = μ and Var(Y1) = σ². Define the random variable

Dn = (Σ_{i=1}^n Xi − Σ_{i=1}^n Yi) / √(2nσ²).

Prove that Dn converges in distribution to a standard normal distribution, i.e., prove that

P(Dn ≤ x) → ∫_{−∞}^x (1/√(2π)) e^{−t²/2} dt as n → ∞, for every real x.
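A simulation can illustrate (not prove) the claim. With Exp(1) chosen as an arbitrary common distribution (so μ = 1 and σ² = 1), Dn should have roughly mean 0 and variance 1 for large n.

```python
import random
from math import sqrt

# Monte Carlo illustration of the claim (not a proof): Xi, Yi ~ Exp(1)
# (an arbitrary choice with sigma^2 = 1), so Dn = (sum X - sum Y)/sqrt(2n).
random.seed(1)
n, reps = 500, 2000
samples = []
for _ in range(reps):
    sx = sum(random.expovariate(1.0) for _ in range(n))
    sy = sum(random.expovariate(1.0) for _ in range(n))
    samples.append((sx - sy) / sqrt(2 * n))

mean = sum(samples) / reps
var = sum((d - mean) ** 2 for d in samples) / reps
print(abs(mean) < 0.1 and abs(var - 1) < 0.15)  # True
```

The actual proof goes through characteristic functions (or the Lindeberg CLT applied to the differences Xi − Yi, which are i.i.d. with mean 0 and variance 2σ²).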
Let X1, X2 be two independent exponential random variables with λ = 1. Compute P(X1 + X2 < t) using the joint density function. Also, let Z be a gamma random variable with parameters (2, 1); compute P(Z < t). What can you conclude by comparing P(X1 + X2 < t) and P(Z < t)? Then compare P(X1 + X2 + X3 < t), with Xi i.i.d. ~ Exp(1), against P(Z < t) with Z ~ Gamma(3, 1). (You don't have to compute.) (Hint: you can use the fact that Γ(2) = 1, Γ(3) = 2.)
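The expected conclusion can be checked numerically (a sketch; t = 1.5 is an arbitrary test point): the convolution integral gives P(X1 + X2 < t) = 1 − e^(−t) − t·e^(−t), which is exactly the Gamma(2, 1) c.d.f.

```python
import random
from math import exp

# Check (a sketch): the convolution answer 1 - e^{-t} - t e^{-t}, i.e. the
# Gamma(2, 1) c.d.f., matches a Monte Carlo estimate of P(X1 + X2 < t).
def conv_cdf(t):
    return 1 - exp(-t) - t * exp(-t)

random.seed(2)
t, n = 1.5, 100_000
hits = sum(random.expovariate(1.0) + random.expovariate(1.0) < t
           for _ in range(n))
print(abs(hits / n - conv_cdf(t)) < 0.01)  # True
```

This is the general pattern the problem is driving at: a sum of k i.i.d. Exp(1) variables is Gamma(k, 1).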
Let X1, X2, and X3 be three independent, continuous random variables with the same distribution. Given that X2 is smaller than X3, what is the conditional probability that X1 is smaller than X2?
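A quick simulation agrees with the symmetry argument (a sketch, using Uniform(0, 1) as an arbitrary continuous distribution): P(X1 < X2 < X3) = 1/6 and P(X2 < X3) = 1/2, so the conditional probability should be 1/3.

```python
import random

# Monte Carlo check: by symmetry the answer should be
# P(X1 < X2 < X3) / P(X2 < X3) = (1/6) / (1/2) = 1/3.
random.seed(3)
n = 200_000
both = cond = 0
for _ in range(n):
    x1, x2, x3 = random.random(), random.random(), random.random()
    if x2 < x3:
        cond += 1
        both += x1 < x2
print(abs(both / cond - 1 / 3) < 0.01)  # True
```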
Let X1, X2, ..., Xr be independent exponential random variables with parameter λ. a. Find the moment-generating function of Y = X1 + X2 + ... + Xr. b. What is the distribution of the random variable Y?
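The answer to part (b) is that Y ~ Gamma(r, λ) (the sum of r i.i.d. exponentials). A numerical check of the m.g.f. (a sketch; λ = 2, r = 3, t = 0.5 are arbitrary with t < λ):

```python
import random
from math import exp

# Monte Carlo check (a sketch; lam, r, t arbitrary with t < lam): the m.g.f.
# of Y = X1 + ... + Xr should match the Gamma(r, lam) m.g.f. (lam/(lam-t))^r.
random.seed(4)
lam, r, t = 2.0, 3, 0.5
n = 200_000

mgf_est = sum(exp(t * sum(random.expovariate(lam) for _ in range(r)))
              for _ in range(n)) / n
mgf_exact = (lam / (lam - t)) ** r

print(abs(mgf_est - mgf_exact) < 0.05)  # True
```

The estimate matches (λ/(λ − t))^r, the product of the r identical exponential m.g.f.s, which is the calculation part (a) asks for.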
Let X1, X2, X3 be independent random variables with E(X1) = 1, E(X2) = 2, and E(X3) = 3. Let Y = 3X1 − 2X2 + X3. Find E(Y) and Var(Y) in the following examples. (a) X1, X2, X3 are Poisson. [Recall that the variance of Poisson(λ) is λ.] (b) X1, X2, X3 are normal, with respective variances σ1² = 1, σ2² = 3, σ3² = 5. Find P(0 ≤ Y ≤ 5). [Recall that any linear combination of independent normal random variables is normal.]
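The arithmetic can be laid out as a short computation (a sketch of the expected numbers, not a substitute for showing the working): E(Y) = 2 in both cases, Var(Y) = 9·1 + 4·2 + 1·3 = 20 in the Poisson case, Var(Y) = 9·1 + 4·3 + 1·5 = 26 in the normal case, and P(0 ≤ Y ≤ 5) follows from Y ~ N(2, 26).

```python
from math import erf, sqrt

# Worked numbers for the problem (a sketch of the computation).
mean_y = 3 * 1 - 2 * 2 + 1 * 3            # E(Y) = 2 in both cases

# Poisson case: Var(Xi) = E(Xi), so Var(Y) = 9*1 + 4*2 + 1*3.
var_pois = 9 * 1 + 4 * 2 + 1 * 3

# Normal case: Var(Y) = 9*1 + 4*3 + 1*5, and Y ~ N(2, 26).
var_norm = 9 * 1 + 4 * 3 + 1 * 5

def norm_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

prob = (norm_cdf((5 - mean_y) / sqrt(var_norm))
        - norm_cdf((0 - mean_y) / sqrt(var_norm)))
print(mean_y, var_pois, var_norm, round(prob, 2))  # 2 20 26 0.37
```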
Let X1 and X2 be independent random variables with distribution functions F1 and F2, respectively. Let Y be a Bernoulli random variable with parameter p. Suppose that Y, X1, and X2 are independent. Prove, using the definition of a distribution function, that the distribution function of Z = Y·X1 + (1 − Y)·X2 is F = p·F1 + (1 − p)·F2. (Don't use moment-generating functions or characteristic functions.)
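The mixture identity can be checked by simulation (a sketch; X1 ~ Exp(1) and X2 ~ Uniform(0, 1) are arbitrary choices for illustration, as are p and the test point z0):

```python
import random
from math import exp

# Monte Carlo check of F(z0) = p*F1(z0) + (1-p)*F2(z0), with X1 ~ Exp(1)
# and X2 ~ Uniform(0,1) picked arbitrarily for illustration.
random.seed(5)
p, n, z0 = 0.3, 200_000, 0.8

hits = 0
for _ in range(n):
    y = 1 if random.random() < p else 0
    z = y * random.expovariate(1.0) + (1 - y) * random.random()
    hits += z <= z0

mixture = p * (1 - exp(-z0)) + (1 - p) * z0   # p*F1(z0) + (1-p)*F2(z0)
print(abs(hits / n - mixture) < 0.01)  # True
```

The written proof conditions on Y: P(Z ≤ z) = P(Z ≤ z | Y = 1)p + P(Z ≤ z | Y = 0)(1 − p) = pF1(z) + (1 − p)F2(z).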
Let X1 and X2 be random variables, not necessarily independent. Show that E[X1 + X2] = E[X1] + E[X2]. You may assume that X1 and X2 are discrete with a joint probability mass function for this problem; the above equality is true also for continuous random variables.
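Linearity of expectation holds without independence, which a tiny dependent joint pmf illustrates exactly (a sketch; the values and probabilities below are made up for illustration):

```python
# A small dependent joint pmf over (X1, X2) to check E[X1 + X2] = E[X1] + E[X2]
# without independence. Values and probabilities are made up for illustration.
joint = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (2, 1): 0.4}

e_x1 = sum(x1 * p for (x1, _), p in joint.items())
e_x2 = sum(x2 * p for (_, x2), p in joint.items())
e_sum = sum((x1 + x2) * p for (x1, x2), p in joint.items())

print(abs(e_sum - (e_x1 + e_x2)) < 1e-12)  # True
```

The proof mirrors this computation: expand E[X1 + X2] as a double sum over the joint pmf and split it into the two marginal sums.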