Let X1 and X2 be uniformly distributed over the interval (0, 1). (a) What is the joint probability density function of Y1 = X1 + X2 and Y2 = X2 − X1? (b) Are Y1 and Y2 independent? Explain.
Given that X1 and X2 are uniformly distributed between 0 and 1, we use the Jacobian (change-of-variables) method to find the joint PDF of (Y1, Y2) from the joint PDF of (X1, X2). The solution is as follows.
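Before the Jacobian computation, a quick Monte Carlo sketch (Python standard library only; sample size and thresholds are arbitrary choices) previews what the answer should look like: Y1 and Y2 are uncorrelated, since Cov(Y1, Y2) = Var(X2) − Var(X1) = 0, but they are not independent, because the support of (Y1, Y2) is the rotated square |y2| ≤ min(y1, 2 − y1).

```python
import random

random.seed(0)
N = 200_000
samples = [(random.random(), random.random()) for _ in range(N)]
y = [(x1 + x2, x2 - x1) for x1, x2 in samples]

# Sample covariance of Y1 and Y2 -- should be near 0,
# since Cov(Y1, Y2) = Var(X2) - Var(X1) = 0.
m1 = sum(y1 for y1, _ in y) / N
m2 = sum(y2 for _, y2 in y) / N
cov = sum((y1 - m1) * (y2 - m2) for y1, y2 in y) / N
print(f"cov(Y1, Y2) ~ {cov:.4f}")

# Dependence shows in the support: |Y2| <= min(Y1, 2 - Y1),
# so Y1 close to 2 forces Y2 close to 0.
assert all(abs(y2) <= min(y1, 2 - y1) + 1e-12 for y1, y2 in y)
p_big = sum(1 for _, y2 in y if abs(y2) > 0.5) / N
p_big_given_edge = (sum(1 for y1, y2 in y if abs(y2) > 0.5 and y1 > 1.8)
                    / max(1, sum(1 for y1, _ in y if y1 > 1.8)))
print(f"P(|Y2|>0.5) ~ {p_big:.3f}, P(|Y2|>0.5 | Y1>1.8) = {p_big_given_edge}")
```

Unconditionally P(|Y2| > 0.5) = 1/4 (triangular density of X2 − X1), yet given Y1 > 1.8 the event is impossible, which already answers part (b) in the negative.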
Let (X1, Y1) and (X2, Y2) be independent and identically distributed continuous bivariate random vectors with joint probability density function fX,Y(x, y) = e^(−y) for 0 < x < y < ∞, and 0 elsewhere. Evaluate P(X2 > X1, Y2 > Y1) + P(X2 < X1, Y2 < Y1).
Let X1, X2, ..., Xn be independent Exp(2) distributed random variables, and set Y1 = X(1) and Yk = X(k) − X(k−1), 2 ≤ k ≤ n. Find the joint pdf of Y1, Y2, ..., Yn. Hint: Note that (Y1, Y2, ..., Yn) = g(X(1), X(2), ..., X(n)), where g is invertible and differentiable. Use the change-of-variable formula to derive the joint pdf of Y1, Y2, ..., Yn.
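By the memoryless property, the change-of-variable computation should produce independent spacings Yk ~ Exp(2(n − k + 1)); a small simulation (Python/NumPy, with a hypothetical n = 4) can be used to sanity-check whatever joint pdf the formula yields, by comparing the empirical mean of each spacing with 1/(2(n − k + 1)).

```python
import numpy as np

rng = np.random.default_rng(1)
n, lam, N = 4, 2.0, 200_000

# Sort each row to get the order statistics X(1) <= ... <= X(n).
x = np.sort(rng.exponential(scale=1 / lam, size=(N, n)), axis=1)
# Y1 = X(1), Yk = X(k) - X(k-1): differences with a prepended zero.
y = np.diff(x, axis=1, prepend=0.0)

# Expected: Yk ~ Exp(lam * (n - k + 1)), i.e. mean 1 / (lam * (n - k + 1)).
expected = 1 / (lam * (n - np.arange(1, n + 1) + 1))
print(y.mean(axis=0), expected)
```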
Let Var(X1) = Var(X3) = 2, Var(X2) = 3, Cov(X1, X2) = 2, Cov(X2, X3) = −2, Cov(X1, X3) = −1. i) Suppose Y1 = X1 − X2. Find Var(Y1). ii) Suppose Y2 = X1 − 2X2 − X3. Find Var(Y2) and Cov(Y1, Y2). Assuming that (X1, X2, X3) is multivariate normal, with mean 0 and covariances as specified above, find the joint density function fY1,Y2(y1, y2). iii) Suppose Y3 = X1 + X2 + X3. Compute the covariance...
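The variance and covariance parts are bilinear-form arithmetic: with coefficient vectors a, b and covariance matrix Σ, Var(aᵀX) = aᵀΣa and Cov(aᵀX, bᵀX) = aᵀΣb. A sketch of the check in Python/NumPy (the matrix below just transcribes the values given in the problem):

```python
import numpy as np

# Covariance matrix of (X1, X2, X3) from the problem statement.
Sigma = np.array([[ 2.0,  2.0, -1.0],
                  [ 2.0,  3.0, -2.0],
                  [-1.0, -2.0,  2.0]])

a = np.array([1.0, -1.0, 0.0])   # Y1 = X1 - X2
b = np.array([1.0, -2.0, -1.0])  # Y2 = X1 - 2*X2 - X3

var_y1 = a @ Sigma @ a           # Var(Y1)
var_y2 = b @ Sigma @ b           # Var(Y2)
cov_y1_y2 = a @ Sigma @ b        # Cov(Y1, Y2)
print(var_y1, var_y2, cov_y1_y2)  # 1.0 2.0 1.0
```

These three numbers also give the 2×2 covariance matrix needed for the bivariate normal density in part ii).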
Let X1, X2, ..., Xn be an independent and identically distributed (i.i.d.) random sample from the Beta distribution with parameters α = 2 and β = 1, i.e., with probability density function fX(x) = 2x for x ∈ (0, 1). Find the probability density functions of the first and last order statistics Y1 and Yn.
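Since FX(x) = x² on (0, 1), the standard order-statistic formulas give FYn(y) = y^(2n) and FY1(y) = 1 − (1 − y²)^n, whose derivatives are the requested densities. A simulation sketch to check against (Python/NumPy; n = 5 and the evaluation point t are hypothetical choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n, N = 5, 200_000

# Beta(2, 1) has CDF F(x) = x^2, so inverse-transform sampling gives sqrt(U).
x = np.sqrt(rng.random((N, n)))
y1, yn = x.min(axis=1), x.max(axis=1)

t = 0.8
p_max_emp, p_max = np.mean(yn <= t), t ** (2 * n)            # F_Yn(t) = t^(2n)
p_min_emp, p_min = np.mean(y1 <= t), 1 - (1 - t ** 2) ** n   # F_Y1(t) = 1 - (1 - t^2)^n
print(p_max_emp, p_max)
print(p_min_emp, p_min)
```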
Q2 Suppose X1, X2, X3 are independent Bernoulli random variables with p = 0.5. Let Yi be the partial sums, i.e., Y1 = X1, Y2 = X1 + X2, Y3 = X1 + X2 + X3. 1. What is the distribution of each Yi, i = 1, 2, 3? 2. What is the expected value of Y1 + Y2 + Y3? 3. Are Y1 and Y2 independent? Explain by computing their joint P.M.F. 4. What is the variance of Y1...
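With only 2³ = 8 equally likely outcomes, the joint P.M.F. of (Y1, Y2) in part 3 can be enumerated directly (a minimal Python sketch). It exhibits the dependence: P(Y1 = 0, Y2 = 2) = 0, whereas P(Y1 = 0)·P(Y2 = 2) = 1/2 · 1/4 = 1/8.

```python
from collections import Counter
from itertools import product

joint = Counter()
for x1, x2, _ in product((0, 1), repeat=3):  # each outcome has prob 1/8
    joint[(x1, x1 + x2)] += 1 / 8            # (Y1, Y2) = (X1, X1 + X2)

p_y1_0 = sum(p for (y1, _), p in joint.items() if y1 == 0)
p_y2_2 = sum(p for (_, y2), p in joint.items() if y2 == 2)
print(joint[(0, 2)], p_y1_0 * p_y2_2)  # 0 vs 0.125, so Y1, Y2 are dependent
```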
(a) Write down the joint pdf of X1 and X2. [4]
(b) By using the transformation of random variable method, find the joint pdf of
Y1 = X1 and Y2 = X2/X1. [16]
(c) Hence find the marginal pdfs of Y1 and Y2. [8]
(d) Compute the covariance between Y1 and Y2, cov [Y1, Y2]. [8]
(e) State, with justification, whether Y1 and Y2 are independent.
2. Let Z1 and Z2 be independent standard normal random variables. Let X1 = 2Z1 + 3Z2 + 12 and X2 = 3Z1 − Z2 + 11. (a) Find the joint density function of (X1, X2). (b) Find the covariance of X1 and X2. Now let Y1 = X1 + 4X2 + 3 and Y2 = −2X1 + 6X2 + 5. (a) Find the joint density function of (Y1, Y2). (b) Find the covariance of Y1 and Y2.
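For linear combinations of independent standard normals, only the Z-coefficients matter for the covariance parts: writing each variable as aᵀZ + c gives Cov(aᵀZ + c, bᵀZ + d) = aᵀb, so the additive constants drop out. A sketch under the reading of the equations above (the coefficient vectors are transcribed from them and would change if the equations are read differently):

```python
import numpy as np

# Coefficients of (Z1, Z2) in each variable; constants are irrelevant here.
x1 = np.array([2.0, 3.0])    # X1 = 2*Z1 + 3*Z2 + const
x2 = np.array([3.0, -1.0])   # X2 = 3*Z1 -   Z2 + const

cov_x = x1 @ x2              # Cov(X1, X2) = 2*3 + 3*(-1) = 3
print(cov_x)

# Express Y1 = X1 + 4*X2 and Y2 = -2*X1 + 6*X2 in Z-coordinates:
yc1 = x1 + 4 * x2            # (14, -1)
yc2 = -2 * x1 + 6 * x2       # (14, -12)
cov_y = yc1 @ yc2            # Cov(Y1, Y2)
print(cov_y)
```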
5. Suppose U1 and U2 are independent uniform random variables on (0, 1), Ui ~ Unif(0, 1), i = 1, 2. Let Xi = −log(1 − Ui), i = 1, 2. (a) Show that X1 and X2 are independent exponential random variables with mean 1, Xi ~ Exp(1), i = 1, 2. (b) Find the joint density function of Y1 = X1 + X2 and Y2 = X1/X2 and show that Y1 and Y2 are independent.
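Both claims can be previewed by simulation (Python/NumPy; sample size and cutoffs are arbitrary): −log(1 − U) should have mean 1 and P(X ≤ 1) = 1 − 1/e like an Exp(1) variable, and for the sum and ratio, independence can be spot-checked on a pair of events by comparing the joint frequency with the product of the marginals.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 200_000

u1, u2 = rng.random(N), rng.random(N)
x1, x2 = -np.log(1 - u1), -np.log(1 - u2)   # inverse-transform sampling
print(x1.mean(), np.mean(x1 <= 1.0))        # Exp(1): mean 1, P(X <= 1) = 1 - 1/e

y1, y2 = x1 + x2, x1 / x2
# Spot-check independence: P(Y1 < 2, Y2 < 1) vs P(Y1 < 2) * P(Y2 < 1).
a, b = y1 < 2.0, y2 < 1.0
print(np.mean(a & b), np.mean(a) * np.mean(b))
```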
Let X1, X2 be two independent exponential random variables with λ = 1. Compute P(X1 + X2 < t) using the joint density function. Now let Z be a gamma random variable with parameters (2, 1), and compute P(Z < t). What do you find by comparing P(X1 + X2 < t) and P(Z < t)? Then compare P(X1 + X2 + X3 < t), where the Xi are i.i.d. (independent and identically distributed) ~ Exp(1), with P(Z < t) for Z ~ Gamma(3, 1). (You don't have to compute this last pair.)
(Hint: You can use the fact that Γ(2) = 1, Γ(3) = 2.)
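The comparison the problem is driving at can be previewed numerically: the Gamma(2, 1) CDF has the closed form P(Z ≤ t) = 1 − e^(−t)(1 + t), and an empirical estimate of P(X1 + X2 < t) should match it (a minimal sketch, Python standard library only, with t = 1 as an illustrative value):

```python
import math
import random

random.seed(3)
N, t = 200_000, 1.0

# Empirical estimate of P(X1 + X2 < t) for X1, X2 ~ Exp(1) i.i.d.
hits = sum(random.expovariate(1.0) + random.expovariate(1.0) < t
           for _ in range(N))
p_emp = hits / N

# Gamma(2, 1) CDF at t (closed form, using Gamma(2) = 1).
p_gamma = 1 - math.exp(-t) * (1 + t)
print(p_emp, round(p_gamma, 4))  # both ~ 0.2642
```

The agreement illustrates the general fact the problem is after: a sum of n i.i.d. Exp(1) variables is Gamma(n, 1).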
Problem 2 [10 points] Let...
Let X1 and X2 be independent exponential random variables with parameters λ1 and λ2, respectively. Find the joint probability density function of X1 + X2 and X1 − X2.
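Whatever density comes out of the Jacobian step here can be sanity-checked by simulation (Python/NumPy; the rates λ1 = 1, λ2 = 2 are illustrative): S = X1 + X2 and D = X1 − X2 must satisfy E[S] = 1/λ1 + 1/λ2 and E[D] = 1/λ1 − 1/λ2, and the support of the joint pdf must be the wedge |d| < s.

```python
import numpy as np

rng = np.random.default_rng(5)
lam1, lam2, N = 1.0, 2.0, 200_000

x1 = rng.exponential(scale=1 / lam1, size=N)
x2 = rng.exponential(scale=1 / lam2, size=N)
s, d = x1 + x2, x1 - x2

# E[S] = 1/lam1 + 1/lam2 = 1.5, E[D] = 1/lam1 - 1/lam2 = 0.5 here.
print(s.mean(), d.mean())

# X1, X2 > 0 forces |D| < S: the derived joint pdf must vanish outside this wedge.
assert np.all(np.abs(d) <= s)
```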