Problem

Entropy. Consider a distribution over n possible outcomes, with probabilities p_1, p_2, ..., p_n.

(a) Just for this part of the problem, assume that each p_i is a power of 2 (that is, of the form 1/2^k). Suppose a long sequence of m samples is drawn from the distribution and that for all 1 ≤ i ≤ n, the ith outcome occurs exactly m·p_i times in the sequence. Show that if Huffman encoding is applied to this sequence, the resulting encoding will have length

    Σ_{i=1}^{n} m p_i log₂(1/p_i).

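As an illustrative sanity check (not part of the problem itself), the claimed length can be verified numerically for one dyadic distribution. The distribution p = (1/2, 1/4, 1/8, 1/8) and sample size m = 8 below are arbitrary choices; the Huffman construction is the standard greedy merge of the two least-frequent symbols:

```python
import heapq
import math

def huffman_code_lengths(freqs):
    # Build a Huffman tree over symbol frequencies and return a dict
    # mapping each symbol index to its codeword length (depth in the tree).
    # The middle tuple element is a unique tiebreaker so dicts are never compared.
    heap = [(f, i, {i: 0}) for i, f in enumerate(freqs)]
    heapq.heapify(heap)
    counter = len(freqs)
    while len(heap) > 1:
        f1, _, d1 = heapq.heappop(heap)
        f2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees pushes every symbol in them one level deeper.
        merged = {s: depth + 1 for s, depth in {**d1, **d2}.items()}
        heapq.heappush(heap, (f1 + f2, counter, merged))
        counter += 1
    return heap[0][2]

# Dyadic distribution: every p_i is a power of 2 (an example, not the general case).
p = [1/2, 1/4, 1/8, 1/8]
m = 8  # chosen so that m * p_i is an integer for each i
counts = [round(m * pi) for pi in p]          # [4, 2, 1, 1]
lengths = huffman_code_lengths(counts)
encoded = sum(counts[i] * lengths[i] for i in range(len(p)))
claimed = sum(m * pi * math.log2(1 / pi) for pi in p)
# For dyadic p_i, Huffman assigns codeword length log2(1/p_i) to outcome i,
# so `encoded` and `claimed` coincide.
```

Here the Huffman codeword lengths come out to (1, 2, 3, 3), so the encoded length is 4·1 + 2·2 + 1·3 + 1·3 = 14, matching Σ m p_i log₂(1/p_i) = 8 · 1.75.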
(b) Now consider arbitrary distributions; that is, the probabilities p_i are not restricted to powers of 2. The most commonly used measure of the amount of randomness in the distribution is the entropy

    Σ_{i=1}^{n} p_i log₂(1/p_i).

For what distribution (over n outcomes) is the entropy the largest possible? The smallest possible?
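To build intuition for part (b), the entropy can be computed for a few example distributions over n = 4 outcomes (the particular distributions below are arbitrary illustrative choices):

```python
import math

def entropy(p):
    # Entropy H(p) = sum of p_i * log2(1/p_i), with the standard
    # convention that 0 * log2(1/0) = 0 (zero-probability terms are skipped).
    return sum(pi * math.log2(1 / pi) for pi in p if pi > 0)

n = 4
uniform = [1 / n] * n                  # mass spread evenly over all outcomes
point_mass = [1.0] + [0.0] * (n - 1)   # all mass on a single outcome
skewed = [0.7, 0.1, 0.1, 0.1]          # something in between

# The uniform distribution attains the maximum, log2(n);
# a point mass attains the minimum, 0; everything else lies strictly between.
```

For n = 4 this gives entropy 2 bits for the uniform distribution, 0 for the point mass, and roughly 1.36 bits for the skewed distribution.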
