Question

Compress the given image using the Huffman encoding technique. Also compute the entropy of the image

120 | 101 | 192 | 120 | 120 | 123 | 192 | 123 | 192 | 101 | 105 | 120 | 105 | 120 | 101 | 120 | 105 | 105 | 120 | 120 | 101 | 192 | 105 | 1

Answer #1

Step 1:

The image is 6x6, so there are 36 pixels in total.

Frequency:

symbol       frequency       probability (frequency/36)
101          9               9/36  = 0.250
105          6               6/36  = 0.167
120          10              10/36 = 0.278
123          3               3/36  = 0.083
192          8               8/36  = 0.222
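The table above can be reproduced with a short script; this is a minimal sketch that starts from the symbol counts stated above (the full 36-pixel list is not needed, only the frequencies):

```python
from fractions import Fraction

# Symbol frequencies counted from the 6x6 image (36 pixels in total).
freq = {101: 9, 105: 6, 120: 10, 123: 3, 192: 8}
total = sum(freq.values())                        # 36
prob = {s: Fraction(f, total) for s, f in freq.items()}

# Print symbols from most to least probable, as in the table.
for s in sorted(prob, key=prob.get, reverse=True):
    print(f"symbol {s}: {freq[s]}/{total} = {float(prob[s]):.3f}")
```

Using `Fraction` keeps the probabilities exact; they are rounded to three decimals only for display.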

Source reduction (combine the two least probable entries each round and keep the list sorted, until only two values remain):

Sorted:  120: 0.278   101: 0.250   192: 0.222   105: 0.167   123: 0.083
Round 1: 105 + 123 -> 0.250;  list: 120: 0.278, 101: 0.250, {105,123}: 0.250, 192: 0.222
Round 2: {105,123} + 192 -> 0.472;  list: {105,123,192}: 0.472, 120: 0.278, 101: 0.250
Round 3: 120 + 101 -> 0.528;  two values remain: 0.528 and 0.472

Assigning 0 to the {120,101} branch and 1 to the {105,123,192} branch, then splitting each branch the same way ({105,123} gets prefix 10, 192 gets 11), gives the code table below.

so symbol 120 can be encoded as 00
so symbol 101 can be encoded as 01
so symbol 192 can be encoded as 11
so symbol 105 can be encoded as 100
so symbol 123 can be encoded as 101

Average code length = (10*2 + 9*2 + 8*2 + 6*3 + 3*3)/36 = 81/36 = 2.25 bits/pixel, versus 8 bits/pixel for the uncompressed image.
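The same construction can be done programmatically; this is a minimal heapq-based sketch. Tie-breaking between equal probabilities may assign different bit patterns than the hand worked table, but the code lengths (and hence the average length) come out the same:

```python
import heapq
from itertools import count

def huffman_codes(prob):
    """Build a Huffman code from a {symbol: probability} map."""
    tiebreak = count()  # unique counter so the heap never compares dicts
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in prob.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Pop the two least probable nodes and merge them.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

prob = {101: 9/36, 105: 6/36, 120: 10/36, 123: 3/36, 192: 8/36}
codes = huffman_codes(prob)
avg_len = sum(prob[s] * len(codes[s]) for s in prob)  # 2.25 bits/pixel
```

The three 2-bit codes go to the three most probable symbols (101, 120, 192) and the two 3-bit codes to 105 and 123, matching the lengths in the table above.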

Entropy = -sum over all symbols of p_i * log2(p_i)

Entropy of the image = -(10/36)*log2(10/36) - (9/36)*log2(9/36) - (8/36)*log2(8/36) - (6/36)*log2(6/36) - (3/36)*log2(3/36)

            = 0.513 + 0.500 + 0.482 + 0.431 + 0.299 = 2.2251 bits/pixel (approx.)

The average Huffman code length (2.25 bits/pixel) is only slightly above the entropy, so the code is nearly optimal for this image.
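As a quick check, the entropy can be computed directly from the frequencies; a small sketch using the counts stated in Step 1:

```python
from math import log2

# Symbol frequencies from the 6x6 image.
freq = {101: 9, 105: 6, 120: 10, 123: 3, 192: 8}
total = sum(freq.values())

# H = -sum(p * log2(p)) over the five grey levels.
entropy = -sum((f / total) * log2(f / total) for f in freq.values())
print(f"entropy = {entropy:.4f} bits/pixel")
```

Using the exact probabilities (rather than values rounded to two decimals) gives about 2.2251 bits/pixel.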

Hope the solution is useful                       
