Question

Question 3 (12 points: 3 + 5 + 4; not applicable to this year's exam). A memoryless source emits messages coming from the set M = {m_1, …, m_6}, with probabilities 0.25, 0.23, 0.07, 0.11, 0.24, and 0.1 as used in the answer below.

Answer #1

(a) The entropy of the source (in bits) is obtained from the formula:

H(z) = -P_{m_1}\log_2(P_{m_1}) - P_{m_2}\log_2(P_{m_2}) - P_{m_3}\log_2(P_{m_3}) - P_{m_4}\log_2(P_{m_4}) - P_{m_5}\log_2(P_{m_5}) - P_{m_6}\log_2(P_{m_6})

H(z) = -0.25\log_2(0.25) - 0.23\log_2(0.23) - 0.07\log_2(0.07) - 0.11\log_2(0.11) - 0.24\log_2(0.24) - 0.1\log_2(0.1)

H(z) ≈ 2.43 bits
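
As a quick numeric check (a minimal sketch, assuming the six probabilities read off the working above), the same sum can be evaluated directly:

```python
import math

# Message probabilities as used in the calculation above
probs = [0.25, 0.23, 0.07, 0.11, 0.24, 0.1]

# Source entropy in bits: H = -sum(p * log2(p))
H = -sum(p * math.log2(p) for p in probs)
print(f"H(z) = {H:.3f} bits")  # prints H(z) = 2.433 bits
```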

(b)

[Handwritten Huffman coding tree for the six message probabilities; the working is largely illegible.] The resulting average codeword length is L = 2.45 bits.
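
Since the handwritten construction is not legible, here is one way to reproduce the stated average length; this is a sketch of a standard Huffman construction using Python's heapq (the symbol names m1–m6 and the heap-of-lengths representation are my own, not taken from the original working):

```python
import heapq

# Probabilities from part (a)
probs = {"m1": 0.25, "m2": 0.23, "m3": 0.07, "m4": 0.11, "m5": 0.24, "m6": 0.1}

# Each heap entry: (subtree probability, tie-breaker id, {symbol: codeword length so far})
heap = [(p, i, {sym: 0}) for i, (sym, p) in enumerate(probs.items())]
heapq.heapify(heap)
next_id = len(heap)

while len(heap) > 1:
    p1, _, lens1 = heapq.heappop(heap)
    p2, _, lens2 = heapq.heappop(heap)
    # Merging two subtrees lengthens every codeword beneath them by one bit
    merged = {s: l + 1 for s, l in {**lens1, **lens2}.items()}
    heapq.heappush(heap, (p1 + p2, next_id, merged))
    next_id += 1

lengths = heap[0][2]                      # codeword length of each symbol
L = sum(probs[s] * lengths[s] for s in probs)
print(lengths)                            # m1, m2, m5 -> 2 bits; m4 -> 3 bits; m3, m6 -> 4 bits
print(f"L = {L:.2f} bits")                # L = 2.45 bits
```

These lengths give L = 0.25·2 + 0.23·2 + 0.24·2 + 0.11·3 + 0.07·4 + 0.1·4 = 2.45 bits, matching the value used in part (c).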

(c) The coding efficiency is given by

η = H(z) / L

η = 2.43 / 2.45

η ≈ 0.993, i.e. the code is about 99.3% efficient.
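
Tying the two results together (reusing H and L from the sketches above; a trivial check, not part of the original working):

```python
H, L = 2.433, 2.45                 # entropy from (a), average codeword length from (b)
eta = H / L
print(f"efficiency = {eta:.3f}")   # efficiency = 0.993, i.e. about 99.3%
```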

Similar Homework Help Questions
  • A source emits eight messages with probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/128, and 1/12...

    please, solve ASAP. Thank you. A source emits eight messages with probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/128, and 1/128, respectively. Find the entropy of the source. Obtain the compact binary code and find the average length of the codeword. Determine the efficiency and the redundancy of the code. A source emits eight messages with probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/128, and 1/128, respectively. Find the entropy of the source. Obtain the compact binary code and find...

ADVERTISEMENT
Free Homework Help App
Download From Google Play
Scan Your Homework
to Get Instant Free Answers
Need Online Homework Help?
Ask a Question
Get Answers For Free
Most questions answered within 3 hours.
ADVERTISEMENT
ADVERTISEMENT
Active Questions
ADVERTISEMENT