
COMPUTER NETWORK AND CYBERSECURITY

Q5 Explain the difference between single-bit errors and burst errors in error control in communications systems. (3 marks)

(a) If a noise event causes a burst error to occur that lasts for 0.1 ms (millisecond) and data is being transmitted at 100Mbps, how many data bits will be affected? (3 marks)

(b) Under what circumstances is the use of parity bits an appropriate error control technique?

Answer #1

Q5:

A single-bit error changes only one isolated bit in the data unit. A burst error corrupts two or more bits within a contiguous block; its length is measured from the first corrupted bit to the last, even if some bits in between are unaffected.

a) A burst error lasting 0.1 ms = 0.1 × 10⁻³ s on a link transmitting at 100 Mbps = 100 × 10⁶ bits per second affects (100 × 10⁶ bits/s) × (0.1 × 10⁻³ s) = 10⁴ = 10,000 bits.
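The calculation above is just data rate multiplied by burst duration; a minimal sketch (function name and units are illustrative, not from the question):

```python
def affected_bits(rate_bps: float, burst_s: float) -> int:
    """Bits corrupted by a noise burst: data rate (bits/s) x duration (s)."""
    return int(rate_bps * burst_s)

# 100 Mbps link, 0.1 ms burst
print(affected_bits(100e6, 0.1e-3))  # 10000
```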

b) Parity bits are an appropriate error-control technique only when the expected errors are single-bit errors (more generally, an odd number of flipped bits), because a single parity bit cannot detect an even number of errors. This makes parity suitable for low-noise channels where burst errors are unlikely.
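To illustrate why parity catches single-bit errors but misses even-count errors, here is a small even-parity sketch (helper names are illustrative):

```python
def parity_bit(bits: str) -> int:
    """Even parity: bit that makes the total number of 1s even."""
    return bits.count("1") % 2

def parity_ok(codeword: str) -> bool:
    """Valid if the codeword (data + parity bit) has an even number of 1s."""
    return codeword.count("1") % 2 == 0

word = "1011001"
coded = word + str(parity_bit(word))
print(parity_ok(coded))   # True: no error

# Flip one bit: the single-bit error is detected.
flipped = coded[:3] + ("0" if coded[3] == "1" else "1") + coded[4:]
print(parity_ok(flipped))  # False
```

Flipping a second bit in `flipped` would restore even parity, so the two-bit error would go undetected, which is exactly why parity is unsuitable for burst-prone channels.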
