Suppose three random variables X, Y, Z have a joint distribution
P_{X,Y,Z}(x, y, z) = P_X(x) P_{Z|X}(z|x) P_{Y|Z}(y|z).
Then X and Y are independent given Z. True or false?
Suppose random variables X and Y are independent given Z. Then the joint distribution must be of the form
P_{X,Y,Z}(x, y, z) = h(x, z) g(y, z),
where h and g are some functions. True or false?
1.
True.
Divide the joint by P_Z(z) to get the conditional joint of (X, Y) given Z:
P(x, y | z) = P_X(x) P_{Z|X}(z|x) P_{Y|Z}(y|z) / P_Z(z) = [P_{X,Z}(x, z) / P_Z(z)] P_{Y|Z}(y|z) = P_{X|Z}(x|z) P_{Y|Z}(y|z).
Since the conditional joint factors into the product of the two conditional marginals, X and Y are independent given Z. The given factorization is exactly a Markov chain X → Z → Y: X and Y may well be dependent marginally, but once Z is known, Y depends on nothing else.
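As a numerical sanity check of the claim in Question 1, we can build a small hypothetical joint of the given form P_X(x) P_{Z|X}(z|x) P_{Y|Z}(y|z) over binary variables (the specific numbers below are made up for illustration) and test whether P(x, y | z) = P(x|z) P(y|z):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary discrete distributions over binary X, Z, Y (hypothetical example).
pX = np.array([0.3, 0.7])                 # P_X(x)
pZgX = rng.dirichlet(np.ones(2), size=2)  # P_{Z|X}(z|x): row x, column z
pYgZ = rng.dirichlet(np.ones(2), size=2)  # P_{Y|Z}(y|z): row z, column y

# Joint P(x,y,z) = P_X(x) P_{Z|X}(z|x) P_{Y|Z}(y|z), indexed [x, y, z].
joint = np.einsum('x,xz,zy->xyz', pX, pZgX, pYgZ)
assert np.isclose(joint.sum(), 1.0)

# Compare P(x,y|z) against P(x|z) * P(y|z) for every z.
pZ = joint.sum(axis=(0, 1))            # P_Z(z)
pXYgZ = joint / pZ                     # P(x, y | z)
pXgZ = joint.sum(axis=1) / pZ          # P(x | z), shape (x, z)
pYgZ2 = joint.sum(axis=0) / pZ         # P(y | z), shape (y, z)
product = np.einsum('xz,yz->xyz', pXgZ, pYgZ2)
print(np.allclose(pXYgZ, product))     # → True: X and Y are independent given Z
```

The check passes for any choice of the three factors, not just this seed, because the algebra above holds identically.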
2.
True.
Conditional independence given Z means P(x, y | z) = P_{X|Z}(x|z) P_{Y|Z}(y|z), so
P_{X,Y,Z}(x, y, z) = P_Z(z) P_{X|Z}(x|z) P_{Y|Z}(y|z).
Taking h(x, z) = P_Z(z) P_{X|Z}(x|z) and g(y, z) = P_{Y|Z}(y|z) gives the required form h(x, z) g(y, z).
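The factorization in Question 2 can also be checked numerically. Below is a hypothetical binary example: build a conditionally independent joint, form h(x, z) = P_Z(z) P_{X|Z}(x|z) and g(y, z) = P_{Y|Z}(y|z), and confirm their product recovers the joint:

```python
import numpy as np

rng = np.random.default_rng(1)

# A conditionally independent joint P(x,y,z) = P_Z(z) P(x|z) P(y|z)
# over binary variables (hypothetical example).
pZ = np.array([0.4, 0.6])
pXgZ = rng.dirichlet(np.ones(2), size=2)  # P(x|z): row z, column x
pYgZ = rng.dirichlet(np.ones(2), size=2)  # P(y|z): row z, column y
joint = np.einsum('z,zx,zy->xyz', pZ, pXgZ, pYgZ)

# The claimed factorization: h(x,z) = P_Z(z) P(x|z), g(y,z) = P(y|z).
h = np.einsum('z,zx->xz', pZ, pXgZ)  # h(x, z)
g = pYgZ.T                           # g(y, z)
print(np.allclose(joint, np.einsum('xz,yz->xyz', h, g)))  # → True
```

Note that h and g need not themselves be probability distributions; any nonnegative functions whose product integrates (sums) to one will do.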