(i) Starting from state 4, let the mean time spent in state 5 before absorption be X. Similarly, when in state 5, let the mean further time spent in state 5 before absorption be Y.
From state 4, the probability of immediate absorption is the probability of moving to any of the four absorbing states {0, 1, 2, 3}:
1/16 + 1/16 + 1/4 + 1/4 = 1/8 + 1/2 = 5/8
Therefore, X = (5/8)*0 + (1/4)*X + (1/8)*(1 + Y)
(3/4)X = (1/8)*(1 + Y)
6X = 1 + Y
Also, from state 5:
Absorption probability = (1/6)*2 + (1/12)*2 = 1/2
Therefore Y = (1/2)*0 + (1/3)*X + (1/6)*(1 + Y)
Y = (X/3) + (1/6)*(1 + Y)
6Y = 2X + 1 + Y
5Y = 2X + 1
Substituting Y = 6X - 1 (from the first equation) into the above:
5(6X - 1) = 2X + 1
30X - 5 = 2X + 1
28X = 6
X = 3/14
Therefore the mean time spent in state 5 before absorption, starting from state 4, is 3/14.
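As a check, the same quantity can be computed from the fundamental matrix N = (I - Q)^{-1}, where Q is the transient-to-transient block of the transition matrix. The entries of Q below are inferred from the first-step equations above (an assumption about the original chain), and exact fractions are used to avoid rounding:

```python
from fractions import Fraction as F

# Transient block Q over states (4, 5), inferred from the equations above:
# P(4->4)=1/4, P(4->5)=1/8, P(5->4)=1/3, P(5->5)=1/6.
Q = [[F(1, 4), F(1, 8)],
     [F(1, 3), F(1, 6)]]

# Fundamental matrix N = (I - Q)^{-1}; N[i][j] is the expected number of
# visits to transient state j (counting the starting visit) from state i.
a, b = 1 - Q[0][0], -Q[0][1]
c, d = -Q[1][0], 1 - Q[1][1]
det = a * d - b * c  # determinant of the 2x2 matrix I - Q
N = [[d / det, -b / det],
     [-c / det, a / det]]

print(N[0][1])  # mean time in state 5 starting from state 4 -> 3/14
```

Note that N[5][5] = 9/7 counts the initial visit to state 5, so it equals Y + 1 = 2/7 + 1, consistent with the equations above.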
(ii) Starting from state 4, let the probability of absorption into {2, 3} be X; starting from state 5, let the same probability be Y.
Then, from state 4:
X = (1/8)*0 + (1/2)*1 + (1/4)*X + (1/8)*Y
8X = 4 + 2X + Y
Y = 6X - 4
Also from state 5, we have here:
Y = (1/3)*0 + (1/6)*1 + (1/3)*X + (1/6)*Y
6Y = 1 + 2X + Y
5Y = 1 + 2X
5(6X - 4) = 1 + 2X
28X = 21
X = 3/4 = 0.75
Therefore 0.75 is the required probability here.
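The same 2x2 linear system can be solved exactly by elimination. The one-step probabilities below are inferred from the equations above (an assumption about the original chain): from state 4 the chain lands in {2, 3} in one step with probability 1/2, and from state 5 with probability 1/6.

```python
from fractions import Fraction as F

p44, p45 = F(1, 4), F(1, 8)   # transient transitions from state 4
p54, p55 = F(1, 3), F(1, 6)   # transient transitions from state 5
r4, r5 = F(1, 2), F(1, 6)     # one-step probability of landing in {2, 3}

# First-step equations: X = r4 + p44*X + p45*Y and Y = r5 + p54*X + p55*Y.
# Solve the second for Y in terms of X, substitute into the first.
X = (r4 + p45 * r5 / (1 - p55)) / (1 - p44 - p45 * p54 / (1 - p55))
Y = (r5 + p54 * X) / (1 - p55)

print(X)  # -> 3/4
print(Y)  # -> 1/2
```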
(iii) From previous part, we saw X = 0.75, therefore
Y = 6X - 4 = 6*0.75 - 4 = 0.5
This is the probability that, starting from state 5, the chain is absorbed in {2, 3}.
Therefore the probability of absorption in {0, 1} starting from state 5 is 1 - Y = 1 - 0.5 = 0.5.
Therefore 0.5 is the required probability.
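As a sanity check, a short Monte Carlo simulation (using the one-step probabilities inferred throughout this solution, which are an assumption about the original chain) should give an absorption frequency into {0, 1} close to 0.5:

```python
import random

# One-step distributions for the transient states, as inferred above:
# from 4: 0,1 w.p. 1/16 each; 2,3 w.p. 1/4 each; 4 w.p. 1/4; 5 w.p. 1/8.
# from 5: 0,1 w.p. 1/6 each; 2,3 w.p. 1/12 each; 4 w.p. 1/3; 5 w.p. 1/6.
STEP = {
    4: ([0, 1, 2, 3, 4, 5], [1/16, 1/16, 1/4, 1/4, 1/4, 1/8]),
    5: ([0, 1, 2, 3, 4, 5], [1/6, 1/6, 1/12, 1/12, 1/3, 1/6]),
}

def absorb(state):
    """Run the chain from `state` until it leaves the transient set {4, 5}."""
    while state in STEP:
        states, probs = STEP[state]
        state = random.choices(states, probs)[0]
    return state

random.seed(0)
trials = 100_000
hits_01 = sum(absorb(5) in {0, 1} for _ in range(trials))
print(hits_01 / trials)  # should be close to 0.5
```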