Question


Consider a (48, 36) block error correcting code used in the downlink control channel of a cellular system. This code can correct up to 5 bit errors in a block of 48 bits.

If the 48-bit blocks are buffered until 10 blocks are collected and then interleaved and transmitted, what is the maximum error burst size that can be tolerated while still receiving the data correctly?

What is the delay incurred in buffering the data for interleaving and de-interleaving at a 48 kbps rate?

Answer #1

First Question:

The code corrects up to t = 5 random bit errors in each 48-bit block. With block interleaving of depth D = 10, the ten 48-bit blocks are written into a 10 x 48 array row by row and the array is transmitted column by column. A burst of b consecutive channel errors is therefore spread across the ten blocks, with each block receiving at most ceil(b/10) of the errors. Every block can still be decoded as long as it sees no more than 5 errors, so the maximum tolerable burst is

D x t = 10 x 5 = 50 bits.

(For comparison, without interleaving a single block could only tolerate a 5-bit burst.)

Second Question:

Before transmission the interleaver must buffer 10 x 48 = 480 bits. At 48 kbps this takes 480 / 48,000 = 10 ms. The de-interleaver at the receiver must buffer the same 480 bits before decoding can begin, so the total delay added by interleaving and de-interleaving is 2 x 10 ms = 20 ms.
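To make the interleaving argument concrete, here is a small sketch (my own illustration, not part of the original answer) that models a depth-10 interleaver over 48-bit blocks with column-wise read-out and counts how a contiguous burst of channel errors lands on the individual blocks:

```python
# Sketch of a 10-block x 48-bit block interleaver (assumed column-wise
# read-out). A burst of consecutive stream errors is mapped back to the
# original blocks to see how many errors each block must correct.
D, N = 10, 48          # interleaving depth (blocks), block length (bits)

def errors_per_block(burst_start, burst_len):
    """Flip burst_len consecutive bits of the interleaved stream and
    count how many of those errors fall in each original block."""
    counts = [0] * D
    for pos in range(burst_start, burst_start + burst_len):
        row = pos % D       # column-wise read-out: stream pos -> block
        counts[row] += 1
    return counts

# A 50-bit burst, wherever it starts, puts exactly 5 errors per block...
assert all(max(errors_per_block(s, 50)) == 5 for s in range(D * N - 50))
# ...while a 51-bit burst always overloads some block with 6 errors.
assert all(max(errors_per_block(s, 51)) == 6 for s in range(D * N - 51))
print("maximum tolerable burst:", D * 5, "bits")
```

Since 50 consecutive stream positions hit each of the 10 residues modulo 10 exactly 5 times, no block ever exceeds its 5-error correction limit; one more bit of burst necessarily pushes some block to 6 errors.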

Hope this helps.
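As a quick check of the buffering-delay arithmetic (assuming the figures from the problem: 10 buffered blocks of 48 bits on a 48 kbps link):

```python
# Buffering delay for interleaving/de-interleaving (assumed values:
# 10 blocks of 48 bits each, 48 kbps channel rate).
bits_buffered = 10 * 48                       # 480 bits per frame
rate_bps = 48_000                             # 48 kbps
one_way_ms = bits_buffered * 1000 / rate_bps  # fill time at one end
total_ms = 2 * one_way_ms                     # interleave + de-interleave
print(one_way_ms, "ms one way,", total_ms, "ms total")
```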
