Problem

Entropy. The Shannon entropy measures the information content of an input string and plays a cornerstone role in information theory and data compression. Given a string of n characters, let f_c be the frequency of occurrence of character c. The quantity p_c = f_c / n is an estimate of the probability that c would appear at a given position if the string were random, and the entropy is defined to be the sum of the quantity −p_c log2 p_c over all characters c that appear in the string. The entropy is said to measure the information content of a string: among strings of a given length that use a given set of characters, the entropy is at its maximum value when each character appears the same number of times, and it drops to zero when the string consists of a single repeated character. Write a program that takes the name of a file as a command-line argument and prints the entropy of the text in that file. Run your program on a web page that you read regularly, a recent paper that you wrote, and the fruit fly genome found on the website.
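
To make the definition concrete, here is a worked evaluation on a small example (the three-character string "aab" is an illustrative choice, not part of the problem). With p_a = 2/3 and p_b = 1/3:

H = -\frac{2}{3}\log_2\frac{2}{3} - \frac{1}{3}\log_2\frac{1}{3} \approx 0.918 \text{ bits per character}

Note that a string using both characters equally often ("ab", "aabb", ...) would instead reach the maximum of log2(2) = 1 bit per character.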

Step-by-Step Solution
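
What follows is a minimal sketch of one possible approach, not an official textbook solution. Java is assumed as the implementation language, and the class name Entropy and the bits-per-character output format are illustrative choices: read the whole file into memory, tally a frequency table f_c, then accumulate −p_c log2 p_c over the characters that appear.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

public class Entropy {
    public static void main(String[] args) throws IOException {
        // Read the entire file named on the command line into one string.
        String text = new String(Files.readAllBytes(Paths.get(args[0])));
        int n = text.length();

        // f_c: count how many times each character occurs.
        Map<Character, Integer> freq = new HashMap<>();
        for (int i = 0; i < n; i++)
            freq.merge(text.charAt(i), 1, Integer::sum);

        // Entropy: sum of -p_c * log2(p_c) over characters that appear,
        // with p_c = f_c / n. Java's Math has no log2, so use log(p)/log(2).
        double entropy = 0.0;
        for (int f : freq.values()) {
            double p = (double) f / n;
            entropy -= p * Math.log(p) / Math.log(2);
        }
        System.out.printf("entropy = %.4f bits per character%n", entropy);
    }
}

Compile with javac Entropy.java and run with java Entropy filename. As a sanity check on the genome input: a file over the four-letter DNA alphabet {A, C, G, T} can have entropy at most log2(4) = 2 bits per character, so any printed value above 2 for such a file indicates a bug.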
