2. Let X1 and X2 be the numbers showing when two fair dice are thrown. Define new random variables X = X1 − X2 and Y = X1 + X2. Show that X and Y are uncorrelated but not independent. [Hint: To show lack of independence, it is enough to show that P[X = j, Y = k] ≠ P[X = j] · P[Y = k] for one pair (j, k); try the pair (0, 2).]
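Both claims can be checked exactly by enumerating the 36 equally likely outcomes; a minimal sketch (the variable names are mine):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes of two fair dice.
outcomes = list(product(range(1, 7), repeat=2))
p = Fraction(1, 36)

# Expectation of an arbitrary function of (X1, X2).
E = lambda f: sum(p * f(a, b) for a, b in outcomes)

# X = X1 - X2, Y = X1 + X2.
EX  = E(lambda a, b: a - b)
EY  = E(lambda a, b: a + b)
EXY = E(lambda a, b: (a - b) * (a + b))
cov = EXY - EX * EY
print("Cov(X, Y) =", cov)          # 0: uncorrelated

# Joint vs. product of marginals at the hinted pair (j, k) = (0, 2).
p_joint = sum(p for a, b in outcomes if a - b == 0 and a + b == 2)
p_x0    = sum(p for a, b in outcomes if a - b == 0)
p_y2    = sum(p for a, b in outcomes if a + b == 2)
print(p_joint, "vs", p_x0 * p_y2)  # 1/36 vs 1/216: not independent
```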
Additional Problem 2. Two fair dice are thrown. Let X1 and X2 denote the outcomes; find the joint pmf and cdf of Λ.
5. Let X1 and X2 be two independent standard normal random variables. Define two new random variables as follows: Y1 = X1 + X2 and Y2 = X1 + βX2. You are not given the constant β, but it is known that Cov(Y1, Y2) = 0. Find (a) the density of Y2, (b) Cov(X2, Y2).
A random experiment consists of throwing two three-sided dice (showing the numbers 1, 2, 3). Let Y be the random variable which records the product of the pair of numbers showing on the dice. (i) Write down the range R_Y of Y. (ii) Determine the probability distribution of Y. (iii) Calculate E(Y) and V(Y).
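All three parts can be checked exactly by enumerating the nine equally likely pairs; a minimal sketch (names are mine):

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# Joint outcomes of two fair three-sided dice, each pair with probability 1/9.
pmf = Counter()
for a, b in product((1, 2, 3), repeat=2):
    pmf[a * b] += Fraction(1, 9)

print(sorted(pmf))                 # range R_Y = [1, 2, 3, 4, 6, 9]
EY  = sum(y * p for y, p in pmf.items())
EY2 = sum(y * y * p for y, p in pmf.items())
print(EY, EY2 - EY**2)             # E(Y) = 4, V(Y) = 52/9
```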
Let X1 ∼ R(0,1) and X2 ∼ Bernoulli(1/3) be two independent random variables; define Y := X1 + X2 and U := X1X2. (a) Find the state space of Y and derive the cdf FY and pdf fY of Y. (You may wish to use {X2 = i}, i = 0, 1, as a partition and apply the total probability formula.) (b) Compute the mean and variance of Y in two different ways, one of them through the pdf of Y...
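A Monte Carlo sketch of the total-probability decomposition for the cdf in part (a) and the moments in part (b); the seed and sample size are arbitrary choices:

```python
import random
import statistics

random.seed(1)
n = 200_000
# Y = X1 + X2 with X1 ~ R(0,1) and X2 ~ Bernoulli(1/3), independent.
y = [random.random() + (1 if random.random() < 1 / 3 else 0) for _ in range(n)]

# Total-probability form of the cdf, conditioning on {X2 = 0} and {X2 = 1}:
# F_Y(t) = (2/3) * P(X1 <= t) + (1/3) * P(X1 <= t - 1), with X1 ~ R(0,1).
clip = lambda u: min(max(u, 0.0), 1.0)
F = lambda t: (2 / 3) * clip(t) + (1 / 3) * clip(t - 1)

print(statistics.mean(y))                 # ≈ 5/6
print(statistics.variance(y))             # ≈ 11/36
print(sum(v <= 0.5 for v in y) / n, F(0.5))  # empirical vs analytic cdf at 0.5
```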
2. Assume two fair dice are rolled. Let X be the number showing on the first die and Y the number showing on the second die. (a) Construct the matrix showing the joint probability mass function of the pair (X, Y). (b) The pairs inside the matrix corresponding to a fixed value of X − Y form a straight line of entries inside the matrix. Draw those lines and use them to construct the probability mass function of the random variable X − Y; make...
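Parts (a) and (b) can be verified exactly; a sketch in which the summing step mirrors the lines of constant X − Y inside the matrix:

```python
from fractions import Fraction
from itertools import product
from collections import Counter

# (a) 6x6 matrix of the joint pmf: every cell holds 1/36.
joint = {(x, y): Fraction(1, 36) for x, y in product(range(1, 7), repeat=2)}

# (b) Summing along each line of constant X - Y gives the pmf of X - Y.
pmf = Counter()
for (x, y), p in joint.items():
    pmf[x - y] += p

print(pmf[0], pmf[1], pmf[5])   # 6/36, 5/36, 1/36
```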
Let X1 and X2 be two independent continuous random variables. Define T = R1·I(X1 > 0) + R2·I(X2 > 0) and S = I(X1 > 0) + 2·I(X2 > 0), where R1 and R2 are the Wilcoxon signed ranks of X1 and X2, respectively. (a) Assume that X1 and X2 have symmetric distributions about 0. Show that Pr(T = t) = Pr(S = t) for t = 0, 1, 2, 3, using the properties of symmetry: −Xi ∼ Xi and Pr(Xi > 0) = Pr(Xi < 0) = 0.5. (b) Suppose that X1 and X2 are identically distributed with common density f(x) = 1 for −0.5 ≤ x < 0 and f(x) = 0.5 for 0 ≤ x ≤ 1; show that Pr(T = t) ≠ Pr(S...
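Assuming T = R1·I(X1 > 0) + R2·I(X2 > 0) (the Wilcoxon signed-rank statistic for n = 2) and S = I(X1 > 0) + 2·I(X2 > 0), a simulation with standard normal inputs (symmetric about 0) illustrates part (a): both statistics come out uniform on {0, 1, 2, 3}. The seed and sample size are arbitrary choices:

```python
import random

random.seed(2)
n = 100_000
count_T = [0] * 4
count_S = [0] * 4
for _ in range(n):
    x1, x2 = random.gauss(0, 1), random.gauss(0, 1)  # symmetric about 0
    # Wilcoxon signed ranks: rank of |X_i| among |X1|, |X2| (ties have prob 0).
    r1 = 1 if abs(x1) < abs(x2) else 2
    r2 = 3 - r1
    T = r1 * (x1 > 0) + r2 * (x2 > 0)
    S = (x1 > 0) + 2 * (x2 > 0)
    count_T[T] += 1
    count_S[S] += 1

print([c / n for c in count_T])   # each ≈ 0.25
print([c / n for c in count_S])   # each ≈ 0.25
```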
(2) Given two independent variables X1 and X2 having Bernoulli distribution with parameter p = 1/3, let Y1 = 2X1 and Y2 = 2X2. Then: A E[Y1 · Y2] = 2/9; B E[Y1 · Y2] = 4/9; C P[Y1 · Y2 = 0] = 1/9; D P[Y1 · Y2 = 0] = 2/9. (3) Let X and Y be two independent random variables having Gaussian (normal) distribution with mean 0 and variance equal to 2. Then: A P[X + Y > 2] > 0.5; B...
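Question (2) can be settled exactly with fractions; a minimal sketch:

```python
from fractions import Fraction
from itertools import product

p = Fraction(1, 3)
# Joint pmf of (X1, X2), independent Bernoulli(1/3).
prob = {(a, b): (p if a else 1 - p) * (p if b else 1 - p)
        for a, b in product((0, 1), repeat=2)}

# Y1 = 2*X1, Y2 = 2*X2.
E  = sum(q * (2 * a) * (2 * b) for (a, b), q in prob.items())
P0 = sum(q for (a, b), q in prob.items() if (2 * a) * (2 * b) == 0)
print(E)    # 4/9
print(P0)   # 8/9
```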