Let {0, 1, 2} be the state space of the hunt, where
0 represents that the hunt is finished,
1 represents that the spider is at position 1 and the fly is at position 2,
2 represents that the spider is at position 2 and the fly is at position 1.
Let Yn and Zn denote the Markov chains of the spider and the fly, respectively, and let Xn denote the Markov chain of the hunt.
3. A spider hunts a fly moving between positions 1 and 2 according to a Markov chain with transition matrix
    [ 0.7  0.3 ]
The fly, independently of the spider, moves between 1 and 2 according to a second Markov chain whose transition matrix is
    [ 0.4  0.6 ]
    [ 0.6  0.4 ]
The hunt finishes the first time both the spider and the fly are at the same position.
(a) Describe the hunt with a suitable 3-state Markov chain.
(b) Assuming that...
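Part (a) can be checked numerically. A minimal sketch, assuming the spider's transition matrix is the symmetric completion [[0.7, 0.3], [0.3, 0.7]] (only its first row survives in the statement above):

```python
# Build the 3-state hunt chain from the spider and fly chains.
# ASSUMPTION: only the spider's first row (0.7, 0.3) appears in the statement;
# a symmetric second row (0.3, 0.7) is assumed here.
SPIDER = [[0.7, 0.3], [0.3, 0.7]]
FLY = [[0.4, 0.6], [0.6, 0.4]]  # given in the statement

def hunt_row(spider_pos, fly_pos):
    """One-step distribution over hunt states 0 (finished), 1 (S@1, F@2),
    2 (S@2, F@1). Positions are 0-indexed: 0 means physical position 1."""
    row = [0.0, 0.0, 0.0]
    for s in (0, 1):
        for f in (0, 1):
            p = SPIDER[spider_pos][s] * FLY[fly_pos][f]  # independence
            if s == f:
                row[0] += p      # same position: hunt over
            elif s == 0:
                row[1] += p      # spider at 1, fly at 2
            else:
                row[2] += p      # spider at 2, fly at 1
    return row

P_hunt = [[1.0, 0.0, 0.0],  # state 0 is absorbing
          hunt_row(0, 1),   # from state 1
          hunt_row(1, 0)]   # from state 2
print(P_hunt)  # rows 1 and 2 come out near (0.54, 0.28, 0.18) and (0.54, 0.18, 0.28)
```

Under the symmetry assumption, the hunt finishes in one step with probability 0.54 from either non-absorbing state.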
Draw a Markov diagram modelling the following. A fly moves along a straight line in unit increments. At each time period it moves one unit to the left with probability 0.3, one unit to the right with probability 0.3, or stays at the same place with probability 0.4, independently of the past history of movements. A spider is lurking at positions 1 and m = 4; if the fly lands there it is captured and the process terminates. Assume that the fly starts...
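Since the fly's starting position is cut off in the statement, here is a sketch that computes, for each interior start, the probability the fly is captured at position 1 by iterating the first-step equations to their fixed point (barriers at 1 and m = 4 as in the statement):

```python
# Absorption probabilities for the fly's walk on {1, 2, 3, 4} with absorbing
# barriers at 1 and 4. Moves: left 0.3, right 0.3, stay 0.4.
def capture_at_1(p_left=0.3, p_right=0.3, iters=5000):
    """Return (u2, u3), where u_i = P(captured at 1 | fly starts at i)."""
    p_stay = 1.0 - p_left - p_right
    u2, u3 = 0.5, 0.5                                    # initial guess
    for _ in range(iters):                               # Gauss-Seidel sweep
        u2 = p_left * 1.0 + p_stay * u2 + p_right * u3   # boundary: u_1 = 1
        u3 = p_left * u2 + p_stay * u3 + p_right * 0.0   # boundary: u_4 = 0
    return u2, u3

u2, u3 = capture_at_1()
print(u2, u3)  # approx 2/3 and 1/3, as the left/right symmetry suggests
```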
An insurance company classifies its customers into three categories: 1. Good Risk, 2. Acceptable Risk, 3. Bad Risk. Customers independently transition between categories at the end of each year according to a Markov chain with transition matrix
    [ 0.6  0.3  0.1 ]
P = [ 0.5  0.0  0.5 ]
    [ 0.4  0.4  0.2 ]
Find the long-run proportion of time in Good Risk, and the expected number of steps needed to return to Good Risk.
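A quick numerical check of both answers, via power iteration on the matrix above (solving the balance equations by hand gives π_Good = 60/113):

```python
# Long-run proportion of time in each risk category via power iteration,
# then expected return time to Good Risk as 1 / pi_good.
P = [[0.6, 0.3, 0.1],
     [0.5, 0.0, 0.5],
     [0.4, 0.4, 0.2]]

pi = [1 / 3, 1 / 3, 1 / 3]
for _ in range(1000):  # pi <- pi P until convergence
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

print(pi[0])      # approx 60/113 = 0.5310..., long-run proportion in Good Risk
print(1 / pi[0])  # approx 113/60 = 1.883..., expected steps to return
```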
Let Xn be a Markov chain with state space {0, 1, 2}, transition probability matrix
    [ 0.3  0.1  0.6 ]
P = [ 0.4  0.4  0.2 ]
    [ 0.1  0.7  0.2 ]
and initial distribution π = (0.2, 0.5, 0.3). Calculate P(X1 = 2) and P(X3 = 2 | X0 = 0).
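A sketch that checks both quantities numerically for the matrix above:

```python
# P(X1 = 2) = sum_i pi0[i] * P[i][2];  P(X3 = 2 | X0 = 0) = (P^3)[0][2].
P = [[0.3, 0.1, 0.6],
     [0.4, 0.4, 0.2],
     [0.1, 0.7, 0.2]]
pi0 = [0.2, 0.5, 0.3]

def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

p_x1_eq_2 = sum(pi0[i] * P[i][2] for i in range(3))
P3 = mat_mul(mat_mul(P, P), P)

print(p_x1_eq_2)  # 0.28
print(P3[0][2])   # 0.276
```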
Let {X(n), n ≥ 0} be the two-state Markov chain on states {0, 1} with transition probability matrix
[ 0.7  0.3 ]
[ 0.4  0.6 ]
Find P(X(2) = 0 and X(5) = 0 | X(0) = 0).
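Taking the garbled question to ask for P(X(2) = 0 and X(5) = 0 | X(0) = 0), the Markov property factors the event into two- and three-step transition probabilities, which can be checked as follows:

```python
# By the Markov property:
# P(X2 = 0, X5 = 0 | X0 = 0) = (P^2)[0][0] * (P^3)[0][0].
P = [[0.7, 0.3],
     [0.4, 0.6]]

def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = mat_mul(P, P)
P3 = mat_mul(P2, P)
answer = P2[0][0] * P3[0][0]
print(P2[0][0], P3[0][0], answer)  # 0.61, 0.583, approx 0.3556
```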
1. A Markov chain on states {1, 2, 3} has transition matrix
        1    2    3
1  [ 0.1  0.3  0.6 ]
2  [ 0    0.4  0.6 ]
3  [ 0.3  0.2  0.5 ]
with initial distribution α = (0.2, 0.3, 0.5). Find the following:
(a) P(X1 = 3 | X0 = 2)
(c) E(X2)
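With the states labelled 1, 2, 3 as in the matrix above, a numerical check of (a) and (c):

```python
# (a) P(X1 = 3 | X0 = 2) is the (2,3) entry of P (1-based labels).
# (c) E(X2) = sum_k k * P(X2 = k), where the distribution of X2 is alpha * P^2.
P = [[0.1, 0.3, 0.6],
     [0.0, 0.4, 0.6],
     [0.3, 0.2, 0.5]]
alpha = [0.2, 0.3, 0.5]

def vec_mat(v, m):
    return [sum(v[i] * m[i][j] for i in range(len(v))) for j in range(len(m[0]))]

part_a = P[1][2]                     # row of state 2, column of state 3
dist_x2 = vec_mat(vec_mat(alpha, P), P)
part_c = sum((k + 1) * p for k, p in enumerate(dist_x2))  # states are 1, 2, 3

print(part_a)    # 0.6
print(dist_x2)   # approx [0.182, 0.273, 0.545]
print(part_c)    # approx 2.363
```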
2. The Markov chain (Xn, n = 0, 1, 2, ...) has state space S = {1, 2, 3, 4, 5} and transition matrix
    [ 0.2  0.8  0    0    0   ]
    [ 0.3  0.7  0    0    0   ]
P = [ 0    0.3  0.5  0.1  0.1 ]
    [ 0.3  0    0.1  0.4  0.2 ]
    [ 0    0    0    0    1   ]
(a) Draw the transition diagram for this Markov chain.
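As a text substitute for the diagram in (a), a sketch that lists every arrow (nonzero transition probability) of the matrix above:

```python
# Print the edges of the transition diagram: one line per nonzero P[i][j].
P = [[0.2, 0.8, 0.0, 0.0, 0.0],
     [0.3, 0.7, 0.0, 0.0, 0.0],
     [0.0, 0.3, 0.5, 0.1, 0.1],
     [0.3, 0.0, 0.1, 0.4, 0.2],
     [0.0, 0.0, 0.0, 0.0, 1.0]]

edges = [(i + 1, j + 1, P[i][j])    # states are labelled 1..5
         for i in range(5) for j in range(5) if P[i][j] > 0]
for src, dst, p in edges:
    print(f"{src} -> {dst}  ({p})")
print(len(edges), "arrows")         # 13 arrows; note state 5 is absorbing
```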
A Markov chain {Xn, n ≥ 0} with state space S = {0, 1, 2} has transition probability matrix
    [ 0.1  0.3  0.6 ]
P = [ 0.5  0.2  0.3 ]
    [ 0.4  0.2  0.4 ]
If P(X0 = 0) = P(X0 = 1) = 0.4 and P(X0 = 2) = 0.2, find the distribution of X2 and evaluate P[X2 < X4].
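A numerical sketch for both parts; P(X2 < X4) is computed by conditioning on X2 and using the two-step transition matrix:

```python
# Distribution of X2: pi2 = pi0 * P^2.
# P(X2 < X4) = sum_i pi2[i] * sum_{j > i} (P^2)[i][j]   (Markov property).
P = [[0.1, 0.3, 0.6],
     [0.5, 0.2, 0.3],
     [0.4, 0.2, 0.4]]
pi0 = [0.4, 0.4, 0.2]

def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def vec_mat(v, m):
    return [sum(v[i] * m[i][j] for i in range(len(v))) for j in range(len(m[0]))]

P2 = mat_mul(P, P)
pi2 = vec_mat(vec_mat(pi0, P), P)
p_lt = sum(pi2[i] * P2[i][j] for i in range(3) for j in range(3) if j > i)

print(pi2)   # approx [0.328, 0.232, 0.44]
print(p_lt)  # approx 0.30816
```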
Consider the Markov chain with the following transition diagram.
[Transition diagram: states 1, 2, 3; every arrow shown carries probability 0.5.]
(a) Write down the transition matrix of the Markov chain. [2 marks]
(b) Compute the two-step transition matrix of the Markov chain. [2 marks]
(c) What is the state distribution π2 at t = 2 if the initial state distribution at t = 0 is π0 = (0.1, 0.5, 0.4)? [3 marks]
(d) What is the average time μ1,1 for the chain to return to state 1?...
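The diagram itself did not survive extraction; assuming it shows each state jumping to each of the other two with probability 0.5 (consistent with the 0.5 labels that remain), parts (c) and (d) can be sketched:

```python
# ASSUMED transition matrix (the diagram is garbled in the source): each state
# moves to each of the other two states with probability 0.5.
P = [[0.0, 0.5, 0.5],
     [0.5, 0.0, 0.5],
     [0.5, 0.5, 0.0]]
pi0 = [0.1, 0.5, 0.4]

def vec_mat(v, m):
    return [sum(v[i] * m[i][j] for i in range(len(v))) for j in range(len(m[0]))]

pi2 = vec_mat(vec_mat(pi0, P), P)   # (c) state distribution at t = 2
print(pi2)                          # [0.275, 0.375, 0.35]

# (d) This P is doubly stochastic, so the stationary distribution is uniform
# and the mean return time to state 1 is 1 / (1/3) = 3.
pi = [1 / 3, 1 / 3, 1 / 3]
for _ in range(1000):               # power iteration confirms stationarity
    pi = vec_mat(pi, P)
print(1 / pi[0])                    # 3.0
```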