3. Let {Xn} be a Markov chain with state space {0, 1, 2}, initial probability vector α, and one-step transition matrix P.
a. Compute P(X0 = 1, X1 = 0, X2 = 2) and P(X2 = 0).
b. Compute P(X2 = 1 | X1 = 2) and P(X2 = 0 | X1 = 1).
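With a concrete α and P in hand, both parts are direct computations; the sketch below uses placeholder values (my own, not from the problem) to show the pattern: part a is a path probability via the chain rule, part b reduces to entries of P.

```python
# Placeholder initial vector and transition matrix on {0, 1, 2}
# (hypothetical values; substitute the ones from the problem).
alpha = [0.25, 0.5, 0.25]
P = [[0.1, 0.6, 0.3],
     [0.2, 0.5, 0.3],
     [0.4, 0.4, 0.2]]

def path_prob(path):
    """P(X0 = path[0], ..., Xk = path[k]) = alpha(i0) p(i0,i1) ... p(i_{k-1},i_k)."""
    pr = alpha[path[0]]
    for a, b in zip(path, path[1:]):
        pr *= P[a][b]
    return pr

def marginal(n):
    """Distribution of Xn, i.e. alpha P^n."""
    dist = alpha
    for _ in range(n):
        dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]
    return dist

print(path_prob((1, 0, 2)))  # P(X0 = 1, X1 = 0, X2 = 2)
print(marginal(2))           # distribution of X2, for P(X2 = 0)
print(P[2][1])               # P(X2 = 1 | X1 = 2), by the Markov property
```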
(10 points) Consider a Markov chain (Xn)n=0,1,2,... with state space S = {1, 2, 3} and transition probability matrix

P = [ 1/5   3/5   1/5
      0     1/2   1/2
      3/10  7/10  0   ]

The initial distribution is α = (1/2, 1/6, 1/3). Compute
(a) P[X2 = k] for all k = 1, 2, 3;
(b) E[X2].
Does the distribution of X2 computed in (a) depend on the initial distribution α? Does the expected value of X2 computed in (b) depend on the initial distribution α? Give a reason for both of your answers.
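As a sanity check for (a) and (b), the distribution of X2 is αP² and E[X2] is then a weighted sum over the states; a quick sketch in exact arithmetic, using the matrix and initial distribution above:

```python
from fractions import Fraction as F

# Transition matrix and initial distribution from the problem.
P = [[F(1, 5), F(3, 5), F(1, 5)],
     [F(0), F(1, 2), F(1, 2)],
     [F(3, 10), F(7, 10), F(0)]]
alpha = [F(1, 2), F(1, 6), F(1, 3)]

def step(dist, P):
    """One step of the chain: returns dist @ P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist2 = step(step(alpha, P), P)                   # distribution of X2
EX2 = sum((k + 1) * dist2[k] for k in range(3))   # states are 1, 2, 3

print([str(x) for x in dist2])  # ['19/200', '167/300', '209/600']
print(EX2)                      # 169/75, i.e. about 2.2533
```

Since every row of P differs, αP² clearly varies with α, which is what the follow-up questions probe.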
5. Let (Xn)n be a Markov chain on a state space S with n-step transition probabilities P^n_{xy} = P(Xn = y | X0 = x). Define

N_y := Σ_{n≥0} 1{Xn = y}   and   U(x, y) := Σ_{n≥0} P^n_{xy}.

Show that
(a) U(x, y) = E[N_y | X0 = x], and
(b) U(x, y) = P(T_y < +∞ | X0 = x) · U(y, y), where T_y = inf{n ≥ 0 : Xn = y}.
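The identity in (b) can be sanity-checked numerically on a chain where U is finite. The example below is my own construction, not from the problem: three states {0, 1, a} with a absorbing, so 0 and 1 are transient and U(x, y) = Σ_n P^n_{xy} is the (x, y) entry of the fundamental matrix N = (I − Q)^{-1}, where Q is the transition matrix restricted to {0, 1}.

```python
# Hypothetical substochastic matrix Q over the transient states {0, 1};
# the remaining mass on each row goes to an absorbing state.
Q = [[0.2, 0.3],
     [0.4, 0.1]]

# U(x, y) = sum_{n>=0} P^n_{xy} = ((I - Q)^{-1})_{xy} for transient x, y
# (2x2 inverse written out by hand).
a, b = 1 - Q[0][0], -Q[0][1]
c, d = -Q[1][0], 1 - Q[1][1]
det = a * d - b * c
U = [[d / det, -b / det],
     [-c / det, a / det]]

# Hitting probability h = P(T_1 < +inf | X0 = 0): first-step analysis gives
# h = Q[0][0] * h + Q[0][1] * 1, so
h = Q[0][1] / (1 - Q[0][0])

# Identity (b): U(0, 1) = P(T_1 < +inf | X0 = 0) * U(1, 1).
print(U[0][1], h * U[1][1])  # both approximately 0.5
```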
(15 points) Suppose that {Xn} is a Markov chain with state space S = {1, 2}, transition matrix

P = [ 1/5  4/5
      2/5  3/5 ]

and initial distribution P(X0 = 1) = 3/4 and P(X0 = 2) = 1/4. Compute the following:
(a) P(X3 = 1 | X1 = 2)
(b) P(X3 = 1 | X2 = 1, X1 = 1, X0 = 2)
(c) P(X2 = 2)
(d) P(X0 = 1, X2 = 1)
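All four quantities reduce to entries of P and P² combined with the initial distribution; a quick check in plain Python, using the matrix above:

```python
# Transition matrix on states {1, 2} and initial distribution.
P = [[0.2, 0.8],
     [0.4, 0.6]]
alpha = {1: 0.75, 2: 0.25}

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = matmul(P, P)  # two-step transition probabilities

# (a) P(X3 = 1 | X1 = 2) = (P^2)_{2,1}, by time homogeneity.
a = P2[1][0]
# (b) P(X3 = 1 | X2 = 1, X1 = 1, X0 = 2) = p(1, 1), by the Markov property.
b = P[0][0]
# (c) P(X2 = 2) = sum_i alpha(i) (P^2)_{i,2}.
c = alpha[1] * P2[0][1] + alpha[2] * P2[1][1]
# (d) P(X0 = 1, X2 = 1) = alpha(1) (P^2)_{1,1}.
d = alpha[1] * P2[0][0]

print(a, b, c, d)  # approximately 0.32, 0.2, 0.65, 0.27
```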
Suppose {Xn} is a Markov chain on the state space S with transition probability p. Let {Yn} be an independent copy of the chain, also with transition probability p, and define Zn := (Xn, Yn).
a) Prove that Zn is a Markov chain on the state space S_hat := S × S with transition probability p_hat : S_hat × S_hat → [0, 1] given by p_hat((x1, y1), (x2, y2)) := p(x1, x2) p(y1, y2).
b) Prove that if π is a...
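The product kernel of part a), and the natural guess for the truncated part b) (that if π is stationary for p then the product measure π ⊗ π is stationary for p_hat), can be checked numerically on a small chain; the 2-state matrix below is my own example, not from the problem:

```python
from itertools import product

# Hypothetical 2-state transition matrix p on S = {0, 1}.
S = [0, 1]
p = {(0, 0): 0.2, (0, 1): 0.8,
     (1, 0): 0.4, (1, 1): 0.6}

# Product kernel p_hat on S_hat = S x S, as in part a).
S_hat = list(product(S, S))
p_hat = {(z1, z2): p[z1[0], z2[0]] * p[z1[1], z2[1]]
         for z1 in S_hat for z2 in S_hat}

# Each row of p_hat sums to 1, so it is a genuine transition probability.
for z1 in S_hat:
    assert abs(sum(p_hat[z1, z2] for z2 in S_hat) - 1) < 1e-12

# pi = (1/3, 2/3) is stationary for p; check that pi (x) pi is
# stationary for p_hat.
pi = {0: 1 / 3, 1: 2 / 3}
pi_hat = {z: pi[z[0]] * pi[z[1]] for z in S_hat}
for z2 in S_hat:
    lhs = sum(pi_hat[z1] * p_hat[z1, z2] for z1 in S_hat)
    assert abs(lhs - pi_hat[z2]) < 1e-12
print("product kernel is stochastic; pi (x) pi is stationary")
```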
Consider the Markov chain with state space S = {0, 1, 2, ...} and transition probabilities

p(i, j) = p if j = i + 1;  q if j = 0;  0 otherwise,

where p, q > 0 and p + q = 1. This example was discussed in class a few lectures ago; it counts the lengths of runs of heads in a sequence of independent coin tosses.
1) Show that the chain is irreducible.
2) Find P_x(T0 = n) for n = 1, 2, .... What is the name of this distribution?
3) Is the chain recurrent?...
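For 2), from any state x the chain hits 0 at the first "tails", so T0 should be geometric with parameter q (P_x(T0 = n) = p^(n-1) q, mean 1/q). A short simulation with hypothetical values p = 0.7, q = 0.3 agrees with that mean:

```python
import random

random.seed(0)
p, q = 0.7, 0.3  # hypothetical values with p + q = 1

def sample_T0(x):
    """Return T0 = inf{n >= 1 : Xn = 0} started from state x.

    Only the up-or-reset coin matters, so the value of x never
    enters: the law of T0 is the same from every starting state.
    """
    n = 0
    while True:
        n += 1
        if random.random() < q:  # jump to 0
            return n
        # otherwise move from the current state i to i + 1

draws = [sample_T0(x=5) for _ in range(20000)]
mean = sum(draws) / len(draws)
print(mean)  # close to 1/q = 3.333...
```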
1. Let {Xn, n ≥ 0} be a Markov chain with state space S. Show that for any n, m ≥ 1 and i_{n+m}, ..., i_{n+1}, i_n, i_{n-1}, ..., i_0 ∈ S,

P(X_{n+m} = i_{n+m}, ..., X_{n+1} = i_{n+1} | X_n = i_n, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = P(X_{n+m} = i_{n+m}, ..., X_{n+1} = i_{n+1} | X_n = i_n).
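This extended Markov property can be illustrated numerically: for a small chain (hypothetical initial distribution and matrix below, my own choices), build the joint law of (X0, ..., X3) by the chain rule and compare the two conditional probabilities for n = 1, m = 2:

```python
from itertools import product

# Hypothetical 2-state chain.
alpha = [0.75, 0.25]
P = [[0.2, 0.8],
     [0.4, 0.6]]

def joint(path):
    """P(X0 = path[0], ..., Xk = path[k]) via the chain rule."""
    pr = alpha[path[0]]
    for a, b in zip(path, path[1:]):
        pr *= P[a][b]
    return pr

# Compare P(X3 = k, X2 = j | X1 = i, X0 = h) with P(X3 = k, X2 = j | X1 = i).
for h, i, j, k in product(range(2), repeat=4):
    lhs = joint((h, i, j, k)) / joint((h, i))
    rhs = (sum(joint((h2, i, j, k)) for h2 in range(2))
           / sum(joint((h2, i)) for h2 in range(2)))
    assert abs(lhs - rhs) < 1e-12
print("extended Markov property holds for all index choices")
```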
Q5. Consider a Markov chain {Xn | n ≥ 0} with state space S = {0, 1, · · · } and transition matrix (pij). Find (in terms of Q_A for an appropriate set A)

P{ max_{0≤k≤n} Xk ≤ m | X0 = i }.

Q6. (Flexible Manufacturing System). Consider a machine which can produce three types of parts. Let Xn denote the state of the machine in the nth time period [n, n + 1), which takes values in {0, 1, 2, 3}. Here...
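For Q5, taking A = {0, ..., m} and Q_A the restriction of (pij) to A, the answer is the i-th row sum of Q_A^n. The sketch below checks this on a small hypothetical 4-state chain (my own matrix) against brute-force path enumeration:

```python
from itertools import product

# Hypothetical 4-state chain on {0, 1, 2, 3}.
P = [[0.5, 0.5, 0.0, 0.0],
     [0.3, 0.3, 0.4, 0.0],
     [0.0, 0.4, 0.3, 0.3],
     [0.0, 0.0, 0.6, 0.4]]
m, n, i = 1, 3, 0  # assumes i <= m (otherwise the probability is 0)
A = list(range(m + 1))

# Q_A: transition matrix restricted to A (substochastic).
Q = [[P[a][b] for b in A] for a in A]

def matmul(X, Y):
    return [[sum(X[r][k] * Y[k][c] for k in range(len(Y)))
             for c in range(len(Y[0]))] for r in range(len(X))]

Qn = Q
for _ in range(n - 1):
    Qn = matmul(Qn, Q)
ans = sum(Qn[i])  # P(max_{0<=k<=n} Xk <= m | X0 = i)

# Brute force: enumerate all length-n continuations from i and keep
# those that never leave A.
brute = 0.0
for w in product(range(4), repeat=n):
    pr = P[i][w[0]]
    for a, b in zip(w, w[1:]):
        pr *= P[a][b]
    if max(w) <= m:
        brute += pr

print(ans, brute)  # the two agree, approximately 0.64 here
```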
A Markov chain {Xn, n ≥ 0} with state space S = {0, 1, 2} has transition probability matrix

p = [ 0.1  0.3  0.6
      0.5  0.2  0.3
      0.4  0.2  0.4 ]

If P(X0 = 0) = P(X0 = 1) = 0.4 and P(X0 = 2) = 0.2, find the distribution of X2 and evaluate P[X2 < X4].
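A numerical check for both parts: the distribution of X2 is αp², and P[X2 < X4] conditions on the value of X2 and reuses the two-step matrix p² for the move from X2 to X4.

```python
# Transition matrix and initial distribution from the problem.
p = [[0.1, 0.3, 0.6],
     [0.5, 0.2, 0.3],
     [0.4, 0.2, 0.4]]
alpha = [0.4, 0.4, 0.2]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

p2 = matmul(p, p)
dist2 = [sum(alpha[i] * p2[i][j] for i in range(3)) for j in range(3)]

# P[X2 < X4] = sum_i P(X2 = i) * sum_{j > i} (p^2)_{ij}.
prob = sum(dist2[i] * sum(p2[i][j] for j in range(3) if j > i)
           for i in range(3))

print(dist2)  # approximately [0.328, 0.232, 0.44]
print(prob)   # approximately 0.308
```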