Question


Markov Chains

Consider the Markov chain with transition matrix P = [0 1; 1 0].

1) Compute several powers of P by hand. What do you notice?

2) Argue that a Markov chain with P as its transition matrix cannot stabilize unless both initial probabilities are 1/2.

Answer #1

P = [0 1; 1 0].

1) Observe that P^2 = I, so P has order 2: even powers of P equal I, and odd powers of P equal P.
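As a quick numerical check of this pattern (a sketch using NumPy, not part of the original answer):

```python
import numpy as np

# Transition matrix of the two-state chain from the question.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Powers of P alternate: even powers give the identity, odd powers give P.
print(np.linalg.matrix_power(P, 2))  # identity matrix
print(np.linalg.matrix_power(P, 3))  # equals P
print(np.linalg.matrix_power(P, 4))  # identity again
```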

2) The Markov chain stabilizes when \pi P = \pi. So we need to find \pi = (p_1, p_2) such that \pi (P - I) = 0 and p_1 + p_2 = 1.

Here P - I = [-1 1; 1 -1], so \pi (P - I) = 0 gives -p_1 + p_2 = 0 and p_1 - p_2 = 0, i.e. p_1 = p_2. Combined with p_1 + p_2 = 1, the only solution is p_1 = p_2 = 1/2.
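The same system can be solved numerically by stacking the normalization constraint p_1 + p_2 = 1 onto the transposed equations from \pi (P - I) = 0 (a sketch using NumPy; the variable names are mine, not from the original answer):

```python
import numpy as np

P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# pi (P - I) = 0 is (P - I)^T pi^T = 0; append the row [1, 1] for p1 + p2 = 1.
A = np.vstack([(P - np.eye(2)).T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])

# Least squares on the overdetermined system recovers the exact solution here.
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # both entries are 1/2
```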
