Question

In information theory, Shannon entropy is defined as H(X) = -Sum_x P(x)*log2(P(x)), where P is the probability mass function of the random variable X and the logarithm is base 2. Given a loaded die with P(6) = 0.5 and P(1) = P(2) = P(3) = P(4) = P(5), compute the entropy of the observed rolls 654266. Note: you do not need any further information about Shannon entropy to answer this question.
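The question can be read two ways: the entropy of the die's stated distribution, or the empirical entropy of the observed roll sequence. A minimal Python sketch covering both readings (the function name `shannon_entropy` and the two-interpretation framing are mine, not from the question):

```python
import math
from collections import Counter

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Interpretation 1: entropy of the loaded die itself.
# P(6) = 0.5; the remaining five faces split the other 0.5 equally, so each is 0.1.
die_probs = [0.5] + [0.1] * 5
print(shannon_entropy(die_probs))  # ≈ 2.161 bits

# Interpretation 2: empirical entropy of the observed rolls "654266",
# using the relative frequency of each face in the sequence as its probability.
rolls = "654266"
counts = Counter(rolls)                                # {'6': 3, '5': 1, '4': 1, '2': 1}
empirical_probs = [c / len(rolls) for c in counts.values()]
print(shannon_entropy(empirical_probs))  # ≈ 1.792 bits
```

Under interpretation 1, H = 0.5·log2(2) + 5·0.1·log2(10) ≈ 2.161 bits; under interpretation 2, with frequencies 3/6, 1/6, 1/6, 1/6, H = 0.5 + 0.5·log2(6) ≈ 1.792 bits. Which one is intended depends on the grader's reading of "entropy of observed rolls".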
