Let Xn be a discrete-time Markov chain with transition matrix P. Show that the m-step transition probabilities are independent of the past; that is,

P(X_{n+m} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+m} = j | X_n = i).

Hint: it is clear for m = 1 (this is exactly the Markov property); apply mathematical induction on m.
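The claim can be sanity-checked numerically: conditioning on different pasts should leave the empirical m-step transition frequencies unchanged. A quick sketch for m = 2, using a hypothetical two-state chain chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state chain, used only to illustrate the claim.
P = np.array([[0.5, 0.5],
              [0.3, 0.7]])

# Simulate a long trajectory.
n = 200_000
u = rng.random(n)
x = np.zeros(n, dtype=int)
for t in range(1, n):
    x[t] = int(u[t] < P[x[t - 1], 1])

# Empirical P(X_{t+2} = 1 | X_t = 0, X_{t-1} = k), for each possible past k:
# both frequencies should match (P^2)[0, 1] = 0.6, whatever the past was.
m2 = np.linalg.matrix_power(P, 2)
freqs = []
for k in (0, 1):
    t_idx = np.where((x[1:n - 2] == 0) & (x[0:n - 3] == k))[0] + 1
    freqs.append(float(np.mean(x[t_idx + 2] == 1)))
```

Both conditional frequencies land near (P^2)[0, 1], which is what the induction on m proves in general: the m-step probabilities are the entries of P^m, regardless of the past.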
Consider a two-state Markov chain on the states {1, 2} with one-step transition matrix

P = [ 1-p    p  ]
    [  q   1-q ],   where 0 < p + q < 2.

Show, by induction or otherwise, that the n-step transition matrix is

P^n = 1/(p+q) [ q  p ]  +  (1-p-q)^n/(p+q) [  p  -p ]
              [ q  p ]                     [ -q   q ].

Based upon the above equation, what is lim_{n->inf} P(Xn = 2 | X0 = 1)? How about lim_{n->inf} P(Xn = 1 | X0 = 2)?
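The closed form can be verified numerically before attempting the induction. A sketch with illustrative values p = 0.3, q = 0.5 (any pair with 0 < p + q < 2 would do):

```python
import numpy as np

# Illustrative parameters; the identity holds for any 0 < p + q < 2.
p, q = 0.3, 0.5
P = np.array([[1 - p, p],
              [q, 1 - q]])

limit = np.array([[q, p], [q, p]]) / (p + q)
resid = np.array([[p, -p], [-q, q]]) / (p + q)

# The claimed closed form matches the actual matrix powers.
for n in range(1, 25):
    Pn = limit + (1 - p - q) ** n * resid
    assert np.allclose(Pn, np.linalg.matrix_power(P, n))

# Since |1-p-q| < 1, the second term vanishes as n -> inf, so every row of
# P^n tends to (q/(p+q), p/(p+q)).
assert np.allclose(np.linalg.matrix_power(P, 200), limit)
```

This also answers the limit questions: lim P(Xn = 2 | X0 = 1) = p/(p+q) and lim P(Xn = 1 | X0 = 2) = q/(p+q).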
3. Let Xn be a Markov chain with state space {0, 1, 2}, initial probability vector α and one-step transition matrix P [the numerical entries of α and P did not survive extraction].
a. Compute P(X0 = 1, X1 = 0, X2 = 2) and P(X2 = 0).
b. Compute P(X2 = 1 | X1 = 2) and P(X2 = 0 | X1 = 1).
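The problem's actual α and P are not available, but the method is routine; the sketch below uses hypothetical values chosen only to illustrate the computations:

```python
import numpy as np

# Hypothetical alpha and P (the problem's actual values are missing).
alpha = np.array([0.2, 0.5, 0.3])          # alpha[i] = P(X0 = i)
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.25, 0.25]])

# a. A path probability factors via the Markov property:
#    P(X0=1, X1=0, X2=2) = alpha[1] * P[1,0] * P[0,2]
path_prob = alpha[1] * P[1, 0] * P[0, 2]

# The marginal at time 2 is alpha advanced two steps: alpha P^2.
dist2 = alpha @ np.linalg.matrix_power(P, 2)
p_x2_is_0 = dist2[0]

# b. One-step conditionals are just entries of P, e.g.
#    P(X2 = 1 | X1 = 2) = P[2, 1].
cond = P[2, 1]
```

The same three patterns (path product, alpha P^n marginal, matrix entry lookup) answer both parts once the real α and P are filled in.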
Xn is a discrete-time Markov chain with state space {1, 2, 3}, transition matrix

P = [ .2  .1  .7 ]
    [ .3  .3  .4 ]
    [ .6  .3  .1 ],

and initial probability vector a = [.2, .7, .1]. Compute P(X2 = 2).
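A minimal worked computation: the distribution at time 2 is a P^2, and P(X2 = 2) is its component for state 2 (index 1, since the states {1, 2, 3} are 0-indexed in code).

```python
import numpy as np

a = np.array([0.2, 0.7, 0.1])
P = np.array([[0.2, 0.1, 0.7],
              [0.3, 0.3, 0.4],
              [0.6, 0.3, 0.1]])

# Distribution of X2 is a P^2; state 2 sits at index 1.
dist2 = a @ np.linalg.matrix_power(P, 2)
p_x2_eq_2 = dist2[1]   # -> 0.238
```

By hand: a P = [.31, .26, .43], and then .31(.1) + .26(.3) + .43(.3) = 0.238.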
Let p ∈ [0, 1] with p ≠ 1/2, and let (Xn)_{n≥0} be the Markov chain on ℤ with initial distribution δ0 and transition matrix Π given by

Π(x, y) = p if y = x + 1,  1 - p if y = x - 1,  0 otherwise.

Use the strong law of large numbers to show that each state is transient.
Hint: consider another Markov chain with additional structure but with the same initial distribution and transition matrix.
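The hint points at realizing the chain as a sum of i.i.d. steps, Xn = ξ1 + ... + ξn with P(ξ = +1) = p, which has the same initial distribution and transition matrix but lets the SLLN act on Xn/n. A simulation sketch with the illustrative value p = 0.7:

```python
import numpy as np

rng = np.random.default_rng(42)

# Realize the chain as a sum of i.i.d. +/-1 steps: same delta_0 initial
# distribution and same transition matrix, but now SLLN applies to X_n / n.
p = 0.7                  # illustrative value, p != 1/2
n = 100_000
steps = rng.choice([1, -1], size=n, p=[p, 1 - p])
X = np.cumsum(steps)

drift = X[-1] / n
# X_n / n -> 2p - 1 != 0 a.s., so |X_n| -> infinity and each state is
# visited only finitely often, i.e. every state is transient.
```

The SLLN gives Xn/n → 2p - 1 almost surely; since 2p - 1 ≠ 0 when p ≠ 1/2, |Xn| → ∞ and no state can be visited infinitely often.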
Suppose Xn is a Markov chain on the state space S with transition probability p. Let Yn be an independent copy of the Markov chain with transition probability p, and define Zn := (Xn, Yn). a) Prove that Zn is a Markov chain on the state space S_hat := S × S with transition probability p_hat : S_hat × S_hat → [0, 1] given by p_hat((x1, y1), (x2, y2)) := p(x1, x2)p(y1, y2). b) Prove that if π is a...
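For a finite S, the product kernel p_hat is exactly the Kronecker product of p with itself once pairs (x, y) are flattened to index x·|S| + y. A sketch with an illustrative 2-state p (any stochastic matrix would do); the stationarity check at the end is an assumption about where the truncated part b) is heading, not a statement of it:

```python
import numpy as np

# Illustrative 2-state kernel p.
p = np.array([[0.4, 0.6],
              [0.2, 0.8]])

# p_hat((x1,y1),(x2,y2)) = p(x1,x2) p(y1,y2) is the Kronecker product,
# with (x, y) flattened to index x*|S| + y.
p_hat = np.kron(p, p)

# p_hat is a genuine transition probability: rows sum to 1.
assert np.allclose(p_hat.sum(axis=1), 1.0)

# If pi is stationary for p, then pi x pi is stationary for p_hat, since
# (pi ⊗ pi)(p ⊗ p) = (pi p) ⊗ (pi p) = pi ⊗ pi.
pi = np.array([0.25, 0.75])     # solves pi p = pi for this p
assert np.allclose(pi @ p, pi)
assert np.allclose(np.kron(pi, pi) @ p_hat, np.kron(pi, pi))
```

The Kronecker identity (A ⊗ B)(C ⊗ D) = (AC) ⊗ (BD) is what makes the product-chain computations mechanical.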
1. Let Xn be a Markov chain with states S = {1, 2} and transition matrix

P = [ 1/2  1/2 ]
    [ 1/3  2/3 ].

(1) Compute P(X2 = 2 | X0 = 1).
(2) Compute P(T1 = n | X0 = 1) for n = 1 and n ≥ 2, where T1 = inf{n ≥ 1 : Xn = 1} is the first return time to state 1.
(3) Compute ρ11 = P(T1 < ∞ | X0 = 1). Is state 1 transient or recurrent?
(4) Find the stationary distribution π for the Markov chain Xn.
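A worked computation of all four parts with exact rational arithmetic (the answers below follow from the matrix and the usual first-return decomposition):

```python
import numpy as np
from fractions import Fraction

F = Fraction
P = np.array([[F(1, 2), F(1, 2)],
              [F(1, 3), F(2, 3)]], dtype=object)

# (1) P(X2 = 2 | X0 = 1) = (P^2)[0, 1]  (states 1,2 -> indices 0,1).
P2 = P.dot(P)
ans1 = P2[0, 1]                 # 1/2*1/2 + 1/2*2/3 = 7/12

# (2) Return to 1: either immediately (prob 1/2), or jump to 2, linger
#     there n-2 steps, then come back: (1/2)(2/3)^(n-2)(1/3) for n >= 2.
def p_T1(n):
    return F(1, 2) if n == 1 else F(1, 2) * F(2, 3) ** (n - 2) * F(1, 3)

# (3) Summing the geometric series:
#     rho_11 = 1/2 + (1/2)(1/3)/(1 - 2/3) = 1, so state 1 is recurrent.
rho11 = F(1, 2) + F(1, 2) * F(1, 3) / (1 - F(2, 3))

# (4) pi P = pi with pi summing to 1 gives pi = (2/5, 3/5).
pi = np.array([F(2, 5), F(3, 5)], dtype=object)
```

Using Fraction keeps 7/12 and 2/5, 3/5 exact rather than as floating-point approximations.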
5. Let (Xn)_{n≥0} be a Markov chain on a state space S with n-step transition probabilities p^(n)_{xy} = P(Xn = y | X0 = x). Define

N_y = Σ_{n≥0} 1{Xn = y}   and   U(x, y) = Σ_{n≥0} p^(n)_{xy}.

Show that (a) U(x, y) = E[N_y | X0 = x] and (b) U(x, y) = P(T_y < +∞ | X0 = x) U(y, y), where T_y = inf{n ≥ 0 : Xn = y}.
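Both identities can be checked numerically on a chain whose relevant states are transient, so the sums converge. The chain below is hypothetical, chosen so the quantities are computable by hand: 0 is absorbing, 1 decays to 0, and 2 can move to 0, to 1, or stay put.

```python
import numpy as np

# Hypothetical chain: 0 absorbing; from 1 go to 0 or stay (1/2 each);
# from 2 go to 0 (.3), to 1 (.2), or stay (.5).
P = np.array([[1.0, 0.0, 0.0],
              [0.5, 0.5, 0.0],
              [0.3, 0.2, 0.5]])

# U(x, y) = sum_n (P^n)[x, y], truncated far enough that the transient
# columns have converged (terms decay like 0.5^n).
U = sum(np.linalg.matrix_power(P, n) for n in range(200))

# (a) From 1, the visit count N_1 is geometric with mean 2, and U(1,1) = 2.
u11 = U[1, 1]
# (b) From 2, P(T_1 < inf) = 0.2 / (0.2 + 0.3) = 0.4, and indeed
#     U(2,1) = 0.4 * U(1,1) = 0.8.
u21 = U[2, 1]
```

The numbers mirror the proof: every visit to y after starting at x requires first reaching y (factor P(T_y < ∞ | X0 = x)), after which the expected visit count restarts from y (factor U(y, y)).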
Let P be the transition probability matrix of a Markov chain. Show that if, for some positive integer r, P^r has all positive entries, then so does P^n for all integers n ≥ r.
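The key observation is that (P^n)_{ij} = Σ_k (P^{n-r})_{ik} (P^r)_{kj} ≥ min_k (P^r)_{kj} > 0, because each row of P^{n-r} is a probability vector. A quick numerical illustration with a hypothetical P that has a zero entry but a strictly positive square:

```python
import numpy as np

# Hypothetical P: has a zero entry, yet P^2 is strictly positive.
P = np.array([[0.0, 1.0],
              [0.5, 0.5]])

r = 2
assert (np.linalg.matrix_power(P, r) > 0).all()

# Every higher power stays strictly positive: P^n = P^(n-r) P^r, and each
# entry of P^n is a convex combination of a strictly positive column of P^r.
powers_positive = all(
    (np.linalg.matrix_power(P, n) > 0).all() for n in range(r, 12)
)
```

The finite range checked here is only an illustration; the inequality above proves the statement for every n ≥ r.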
6. Suppose Xn is a two-state Markov chain with one-step transition probabilities p(i, j), and define Zn = (Xn, Xn+1), n = 0, 1, 2, .... Write down the state space of the Markov chain Z0, Z1, ... and determine its transition probability matrix.
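The pair chain lives on the four ordered pairs of states, and (i, j) can only move to a pair whose first component is j, doing so with probability p(j, k). A sketch that builds this 4×4 matrix from a hypothetical p (the problem's actual values are not available):

```python
import numpy as np

# Hypothetical two-state kernel p(i, j).
p = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# State space of Z_n = (X_n, X_{n+1}): all ordered pairs.
states = [(0, 0), (0, 1), (1, 0), (1, 1)]

# Q[(i,j), (j2,k)] = p(j, k) if j2 == j, else 0: the new pair must start
# where the old pair ended.
Q = np.zeros((4, 4))
for a, (i, j) in enumerate(states):
    for b, (j2, k) in enumerate(states):
        if j2 == j:
            Q[a, b] = p[j, k]
```

Each row of Q has exactly two nonzero entries and sums to 1, confirming Q is a genuine transition matrix.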
For a discrete-time Markov chain with transition matrix P = (p_ij), explain what is meant by (i) global balance; (ii) local balance; (iii) doubly stochastic; (iv) birth-death Markov chain.
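These notions fit together neatly for birth-death chains, where local (detailed) balance π_i p_{i,i+1} = π_{i+1} p_{i+1,i} determines the stationary distribution, which then also satisfies global balance π P = π. A sketch on a hypothetical birth-death chain on {0, 1, 2, 3}:

```python
import numpy as np

# Hypothetical birth-death chain: moves only to nearest neighbours.
P = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.3, 0.2, 0.5, 0.0],
              [0.0, 0.3, 0.2, 0.5],
              [0.0, 0.0, 0.3, 0.7]])

# Solve local balance recursively: pi_{i+1} = pi_i * p_{i,i+1} / p_{i+1,i},
# then normalize.
pi = np.ones(4)
for i in range(3):
    pi[i + 1] = pi[i] * P[i, i + 1] / P[i + 1, i]
pi /= pi.sum()

# Local balance holds by construction...
assert np.allclose(pi[:-1] * np.diag(P, 1), pi[1:] * np.diag(P, -1))
# ...and it implies global balance pi P = pi.
assert np.allclose(pi @ P, pi)
```

Note this P is not doubly stochastic (its columns do not sum to 1); for a doubly stochastic chain the uniform distribution would satisfy global balance instead.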