Question

A two-state Markov chain has the states ON and OFF. When the system is ON it is always turned off, and when it is OFF it is always turned on.

a) Draw the Markov Chain

b) Create the transition matrix, call it P.

c) Find P²

d) Find P³

e) Considering what P² and P³ are, explain to me like I am an idiot why the Limiting Distribution can't exist.

f) Find the Stationary Distribution

Answer #1

f) The stationary distribution π must satisfy πP = π with π_ON + π_OFF = 1. Since the chain always switches state, multiplying by P just swaps the two entries of π, so π_ON = π_OFF. Therefore the stationary distribution is (1/2, 1/2).
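
As a sanity check, here is a short numpy sketch (my own addition, not part of the posted answer) that builds the transition matrix P from the problem statement, computes P² and P³, and verifies that (1/2, 1/2) is stationary. Ordering the states as [ON, OFF] is an assumption.

import numpy as np

# States ordered [ON, OFF] (assumed ordering).
# From ON the chain always moves to OFF, and from OFF it always moves to ON.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

P2 = P @ P   # equals the identity matrix
P3 = P2 @ P  # equals P again
print(P2)    # [[1. 0.], [0. 1.]]
print(P3)    # [[0. 1.], [1. 0.]]

# The powers of P alternate between I and P, so P^n never converges
# and no limiting distribution exists (this is the point of part e).

pi = np.array([0.5, 0.5])
print(np.allclose(pi @ P, pi))  # True: (1/2, 1/2) is stationary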
