Solution:
For a Markov chain, consider a pair of states (i, j).
We say that j is reachable from i, denoted i → j, if there exists an integer n ≥ 0 such that P^n_{ij} > 0.
This means that, starting in state i, there is a positive probability (though not necessarily equal to 1) that the chain will be in state j at time n (that is, n steps later): P(X_n = j | X_0 = i) > 0.
If j is reachable from i, and i is reachable from j, then the states i and j are said to communicate.
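These two definitions can be checked numerically: j is reachable from i exactly when some power P^n has a positive (i, j) entry, and for a chain with k states it suffices to check the powers n = 0, 1, …, k − 1. The following sketch uses a small hypothetical 3-state transition matrix (not one from the problems above) purely for illustration.

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1); state 2 is absorbing.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.0, 0.5, 0.5],
    [0.0, 0.0, 1.0],
])

def reachable(P, i, j):
    """True if j is reachable from i, i.e. P^n[i, j] > 0 for some n >= 0.

    For a k-state chain it is enough to check n = 0, ..., k - 1: any
    path from i to j can be shortened to one of length < k.
    """
    k = P.shape[0]
    Pn = np.eye(k)            # P^0 = I, so every state reaches itself
    for _ in range(k):
        if Pn[i, j] > 0:
            return True
        Pn = Pn @ P           # advance to the next power of P
    return False

def communicate(P, i, j):
    """States i and j communicate if each is reachable from the other."""
    return reachable(P, i, j) and reachable(P, j, i)

print(reachable(P, 0, 2))    # True: 0 -> 1 -> 2 has positive probability
print(communicate(P, 0, 2))  # False: state 2 is absorbing, so 2 -/-> 0
```

Note the loop starts from P^0 = I, matching the convention n ≥ 0 in the definition, so reachability is reflexive by construction.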
The communication relation defined by the Markov chain satisfies the following conditions:
1. Reflexivity: every state communicates with itself, since P^0_{ii} = 1 > 0.
2. Symmetry: if i ↔ j, then j ↔ i.
3. Transitivity: if i ↔ k and k ↔ j, then i ↔ j.
The above conditions imply that communication is an equivalence relation, meaning that it shares the defining properties of the more familiar equality relation "=": i = i; if i = j, then j = i; and if i = k and k = j, then i = j.
Only condition 3 above needs justification, so we now prove it for completeness.
Suppose there exist integers n, m such that P^n_{ik} > 0 and P^m_{kj} > 0.
Letting l = n + m, we conclude that P^l_{ij} ≥ P^n_{ik} P^m_{kj} > 0, where we have used the Chapman-Kolmogorov equations.
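The inequality used in the proof follows because the Chapman-Kolmogorov equations express P^{n+m}_{ij} as a sum of nonnegative terms, P^{n+m}_{ij} = Σ_k P^n_{ik} P^m_{kj}, so any single term is a lower bound. A quick numerical spot-check on a hypothetical 3-state chain (chosen only for illustration):

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1).
P = np.array([
    [0.2, 0.5, 0.3],
    [0.4, 0.1, 0.5],
    [0.3, 0.3, 0.4],
])

n, m = 2, 3
Pn = np.linalg.matrix_power(P, n)
Pm = np.linalg.matrix_power(P, m)
Pnm = np.linalg.matrix_power(P, n + m)

# Chapman-Kolmogorov: P^{n+m}[i, j] = sum_k P^n[i, k] * P^m[k, j],
# so every single term of the sum is a lower bound on P^{n+m}[i, j].
for i in range(3):
    for j in range(3):
        for k in range(3):
            assert Pnm[i, j] >= Pn[i, k] * Pm[k, j] - 1e-12

print("inequality holds for all i, j, k")
```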
A little thought reveals that this kind of decomposition can be carried out for any Markov chain: since communication is an equivalence relation, it partitions the state space into disjoint communication classes.
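The partition into communication classes can be computed directly: build the reachability relation from the powers of P, intersect it with its transpose to get mutual reachability, and group states accordingly. A minimal sketch, using a hypothetical 4-state chain in which {0, 1} and {2, 3} each communicate but the chain can only move from the first group to the second:

```python
import numpy as np

# Hypothetical 4-state chain: {0, 1} and {2, 3} are communication
# classes; the chain can leave {0, 1} for {2, 3} but never return.
P = np.array([
    [0.5, 0.4, 0.1, 0.0],
    [0.3, 0.7, 0.0, 0.0],
    [0.0, 0.0, 0.2, 0.8],
    [0.0, 0.0, 0.6, 0.4],
])

def reachability_matrix(P):
    """R[i, j] is True iff j is reachable from i in 0 or more steps."""
    k = P.shape[0]
    R = np.eye(k, dtype=bool)
    Pn = np.eye(k)
    for _ in range(k):
        R |= Pn > 0           # record which (i, j) pairs P^n connects
        Pn = Pn @ P
    return R

def communication_classes(P):
    """Partition the states into classes of mutually reachable states."""
    R = reachability_matrix(P)
    mutual = R & R.T          # i <-> j iff i -> j and j -> i
    classes, seen = [], set()
    for i in range(P.shape[0]):
        if i not in seen:
            cls = {j for j in range(P.shape[0]) if mutual[i, j]}
            classes.append(sorted(cls))
            seen |= cls
    return classes

print(communication_classes(P))  # [[0, 1], [2, 3]]
```

The classes here are disjoint by construction, which is exactly the partition property guaranteed by the equivalence relation; the chain is irreducible precisely when a single class contains every state.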