Problem

A transition probability matrix $P$ is said to be doubly stochastic if

$$\sum_{i=0}^{M} P_{ij} = 1$$

for all states $j = 0, 1, \ldots, M$. Show that if such a Markov chain is ergodic, then $\pi_j = 1/(M+1)$, $j = 0, 1, \ldots, M$.

Step-by-Step Solution
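One way to see this, a sketch assuming the standard characterization of the stationary distribution of an ergodic chain as the unique solution of the balance and normalization equations:

```latex
For an ergodic chain, the limiting probabilities $\pi_j$ are the unique
nonnegative solution of
\[
\pi_j = \sum_{i=0}^{M} \pi_i P_{ij}, \qquad \sum_{j=0}^{M} \pi_j = 1 .
\]
Try the uniform candidate $\pi_j = \tfrac{1}{M+1}$ for every $j$. Substituting
into the balance equations,
\[
\sum_{i=0}^{M} \frac{1}{M+1}\, P_{ij}
  \;=\; \frac{1}{M+1} \sum_{i=0}^{M} P_{ij}
  \;=\; \frac{1}{M+1},
\]
where the last step uses double stochasticity, $\sum_{i=0}^{M} P_{ij} = 1$.
The normalization $\sum_{j=0}^{M} \tfrac{1}{M+1} = 1$ also holds, so by
uniqueness $\pi_j = 1/(M+1)$ for all $j$.
```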

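The result can also be checked numerically. The sketch below (the matrix `P` and the helper `step` are illustrative choices, not from the text) iterates an initial distribution under a small doubly stochastic matrix and watches it converge to the uniform distribution:

```python
def step(pi, P):
    """One step of the distribution under transition matrix P: pi' = pi @ P."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

# A 3x3 doubly stochastic matrix (rows AND columns each sum to 1), so M = 2.
# All entries are positive, hence the chain is ergodic.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.5, 0.3],
     [0.3, 0.2, 0.5]]

pi = [1.0, 0.0, 0.0]       # start far from uniform
for _ in range(200):       # power iteration toward the stationary distribution
    pi = step(pi, P)

print([round(p, 6) for p in pi])   # each entry approaches 1/(M+1) = 1/3
```

Starting from any initial distribution gives the same limit, which is exactly the uniqueness claim the proof relies on.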