Question

Suppose the joint probability distribution of two binary random variables X and Y is given as follows.

x \ y   1      2
0       3/10   0
1       4/10   3/10

(X takes the values 0 and 1 down the side; Y takes the values 1 and 2 across the top.)

a) Show the marginal distribution of X.

b) Find entropy H(Y).

c) Find conditional entropy H(X|Y) and H(Y|X).

d) Find mutual information I(X; Y).

e) Find joint entropy H(X, Y).

f) Suppose X and Y are independent. Show that H(X|Y) = H(X).

g) Suppose X and Y are independent. Show that H(X, Y) = H(X) + H(Y).

h) Show that I(X; X) = H(X).

Answer #1
Joint distribution with marginals:

x \ y     1      2      P(X=x)
0         0.3    0      0.3
1         0.4    0.3    0.7
P(Y=y)    0.7    0.3    1

a) Marginal distribution of X:

X        P(X=x)
0        0.3
1        0.7
total    1
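
As a quick sanity check, here is a minimal Python sketch (NumPy assumed available; the array P below simply encodes the joint table above) that recovers both marginals by summing the joint table along rows and columns:

```python
import numpy as np

# Joint distribution p(x, y): rows are X = 0, 1; columns are Y = 1, 2.
P = np.array([[0.3, 0.0],
              [0.4, 0.3]])

p_x = P.sum(axis=1)   # marginal of X: sum each row over y    -> [0.3, 0.7]
p_y = P.sum(axis=0)   # marginal of Y: sum each column over x -> [0.7, 0.3]

print("P(X=0), P(X=1) =", p_x)
print("P(Y=1), P(Y=2) =", p_y)
```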

b)

Entropy: H(Y) = −Σ_y p(y) log2 p(y)

= −[0.7 log2(0.7) + 0.3 log2(0.3)] = 0.88129
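
The same number can be reproduced with a short Python sketch (base-2 logarithm, so the result is in bits; p_y is the marginal of Y found above):

```python
import numpy as np

p_y = np.array([0.7, 0.3])          # marginal distribution of Y
H_Y = -np.sum(p_y * np.log2(p_y))   # H(Y) = -sum_y p(y) log2 p(y)
print(round(H_Y, 5))                # 0.88129
```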

c)

Conditional distribution of Y given X, p(y|x) = p(x, y)/p(x):

                 y = 1            y = 2
P(Y=y|X=0)       0.3/0.3 = 1      0
P(Y=y|X=1)       0.4/0.7 = 4/7    0.3/0.7 = 3/7

H(Y|X) = −Σ p(x, y) log2 p(y|x), summed over all x ∈ X and y ∈ Y

= −[0.3 log2(1) + 0 + 0.4 log2(4/7) + 0.3 log2(3/7)] = 0.6897

(using the convention 0 log2 0 = 0 for the zero-probability cell)

H(X|Y) = −Σ p(x, y) log2 p(x|y), summed over all x ∈ X and y ∈ Y

= −[0.3 log2(0.3/0.7) + 0 + 0.4 log2(0.4/0.7) + 0.3 log2(0.3/0.3)] = 0.6897
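
Both conditional entropies can also be checked numerically straight from the joint table; the sketch below (the helper name term is just for illustration) applies the same 0 log2 0 = 0 convention by skipping zero-probability cells:

```python
import numpy as np

P = np.array([[0.3, 0.0],   # rows: X = 0, 1
              [0.4, 0.3]])  # columns: Y = 1, 2
p_x = P.sum(axis=1)
p_y = P.sum(axis=0)

def term(p_xy, p_cond):
    """Contribution p(x,y) * log2(p_cond), with 0 log2 0 treated as 0."""
    return 0.0 if p_xy == 0 else p_xy * np.log2(p_cond)

# H(Y|X) = -sum_{x,y} p(x,y) log2 p(y|x), where p(y|x) = p(x,y)/p(x)
H_Y_given_X = -sum(term(P[i, j], P[i, j] / p_x[i])
                   for i in range(2) for j in range(2))

# H(X|Y) = -sum_{x,y} p(x,y) log2 p(x|y), where p(x|y) = p(x,y)/p(y)
H_X_given_Y = -sum(term(P[i, j], P[i, j] / p_y[j])
                   for i in range(2) for j in range(2))

print(round(H_Y_given_X, 4), round(H_X_given_Y, 4))   # 0.6897 0.6897
```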

d)

Mutual information I(X; Y):

I(X; Y) = H(X) − H(X|Y) = −Σ_x p(x) log2 p(x) − H(X|Y)

= −[0.3 log2(0.3) + 0.7 log2(0.7)] − 0.6897 = 0.8813 − 0.6897 = 0.1916
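
As a cross-check, here is a minimal sketch computing I(X; Y) directly from its double-sum definition, I(X; Y) = Σ p(x, y) log2[p(x, y)/(p(x) p(y))], which agrees with H(X) − H(X|Y) above:

```python
import numpy as np

P = np.array([[0.3, 0.0],   # rows: X = 0, 1
              [0.4, 0.3]])  # columns: Y = 1, 2
p_x = P.sum(axis=1)
p_y = P.sum(axis=0)

# Skip zero cells so that 0 log2 0 contributes nothing.
I_XY = sum(P[i, j] * np.log2(P[i, j] / (p_x[i] * p_y[j]))
           for i in range(2) for j in range(2) if P[i, j] > 0)
print(round(I_XY, 4))   # 0.1916, matching H(X) - H(X|Y)
```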

---------------------------------------------------------------

Please note that only the first four subparts per post can be answered, as per HOMEWORKLIB RULES.

