Exercise 7.1 (Gambler's ruin). Let (X_t)_{t≥0} be the gambler's chain on state space Ω = ...
Suppose in the gambler's ruin problem that the probability of winning a bet depends on the gambler's present fortune. Specifically, suppose that a_i is the probability that the gambler wins a bet when his or her fortune is i. Given that the gambler's initial fortune is i, let P(i) denote the probability that the gambler's fortune reaches N before 0. (a) Derive a formula that relates P(i) to P(i−1) and P(i+1). (b) Using the same approach...
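A minimal numerical sketch for part (a): conditioning on the first bet gives the recurrence P(i) = a_i P(i+1) + (1 − a_i) P(i−1) with boundary values P(0) = 0 and P(N) = 1, which can be solved as a linear system over the interior states. The function name `reach_N_before_0` and passing a_i as a callable are illustrative choices, not part of the exercise:

```python
import numpy as np

def reach_N_before_0(a, N):
    """Solve P(i) = a(i) P(i+1) + (1 - a(i)) P(i-1) for i = 1..N-1,
    with boundary conditions P(0) = 0 and P(N) = 1, as a linear system."""
    A = np.zeros((N - 1, N - 1))
    b = np.zeros(N - 1)
    for i in range(1, N):
        A[i - 1, i - 1] = 1.0
        if i + 1 < N:
            A[i - 1, i] = -a(i)          # coefficient of P(i+1)
        else:
            b[i - 1] += a(i)             # P(N) = 1 moves to the right-hand side
        if i - 1 > 0:
            A[i - 1, i - 2] = -(1.0 - a(i))  # coefficient of P(i-1); P(0) = 0 drops out
    P = np.zeros(N + 1)
    P[N] = 1.0
    P[1:N] = np.linalg.solve(A, b)
    return P
```

For the constant fair case a_i = 1/2 this recovers the classical answer P(i) = i/N, which is a useful sanity check.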
Problem 1.0. For the gambler's ruin problem, let M_d denote the mean number of games that must be played until the game ends (either the gambler goes broke or wins all the money) given that the gambler starts with d dollars, d = 0, ..., N. Recall that N is the total amount of money in the game, and using the Section 1.3.3 notation, (a) Show that M_0 = M_N = 0 and M_d = 1 + pM_{d+1} + qM_{d−1} for d = 1, 2, ..., N−1. (b) Use...
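The recurrence in part (a) can be checked numerically by solving it as a linear system; a sketch (the function name `mean_duration` is illustrative, and q = 1 − p as in the problem):

```python
import numpy as np

def mean_duration(p, N):
    """Solve M_d = 1 + p*M_{d+1} + q*M_{d-1} for d = 1..N-1,
    with boundary conditions M_0 = M_N = 0 and q = 1 - p."""
    q = 1.0 - p
    A = np.eye(N - 1)
    b = np.ones(N - 1)          # the "+1" on the right-hand side of each equation
    for d in range(1, N):
        if d + 1 < N:
            A[d - 1, d] = -p        # coefficient of M_{d+1}; M_N = 0 drops out
        if d - 1 > 0:
            A[d - 1, d - 2] = -q    # coefficient of M_{d-1}; M_0 = 0 drops out
    M = np.zeros(N + 1)
    M[1:N] = np.linalg.solve(A, b)
    return M
```

In the fair case p = q = 1/2 the known closed form is M_d = d(N − d), so e.g. N = 4 gives (0, 3, 4, 3, 0).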
Exercise 5.10. Let P be the transition matrix of a Markov chain (X_t)_{t≥0} on a finite state space Ω. Show that the following statements are equivalent: (i) P is irreducible and aperiodic. (ii) There exists an integer r ≥ 0 such that for all i, j ∈ Ω, P^t(i, j) > 0 for every t ≥ r. (iii) There exists an integer r ≥ 0 such that every entry of P^r is positive.
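Statement (iii) can be checked by machine: raise P to successive powers and test whether every entry is positive. The search can stop at Wielandt's bound (n − 1)² + 1, beyond which no new power of a primitive matrix is needed. The function name `primitive_power` is an illustrative choice:

```python
import numpy as np

def primitive_power(P, max_r=None):
    """Return the smallest r >= 1 with every entry of P^r positive,
    or None if no such r exists up to Wielandt's bound (n-1)^2 + 1."""
    n = P.shape[0]
    if max_r is None:
        max_r = (n - 1) ** 2 + 1
    Q = np.eye(n)
    for r in range(1, max_r + 1):
        Q = Q @ P
        if np.all(Q > 0):
            return r
    return None
```

A two-state chain that flips deterministically between its states is irreducible but periodic, and indeed no power of its matrix is entrywise positive.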
Let P be the n × n transition matrix of a Markov chain with a finite state space S = {1, 2, ..., n}. Show that π is the stationary distribution of the Markov chain, i.e., πP = π and Σ_i π_i = 1, if and only if π(I − P + 1ᵀ1) = 1, where I is the n × n identity matrix and 1 = [1 1 ... 1] is a 1 × n row vector with all components being 1.
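The "if" direction gives a practical recipe: since I − P + 1ᵀ1 is invertible for an irreducible chain, π can be computed by solving one linear system. A sketch (the function name `stationary` and the example matrix are illustrative):

```python
import numpy as np

def stationary(P):
    """Solve pi (I - P + 1^T 1) = 1 for the stationary row vector pi."""
    n = P.shape[0]
    ones_row = np.ones((1, n))                    # the 1 x n row vector of ones
    A = np.eye(n) - P + ones_row.T @ ones_row     # I - P + 1^T 1  (an n x n matrix)
    # pi A = 1 is equivalent to the column system A^T pi^T = 1^T
    return np.linalg.solve(A.T, np.ones(n))
```

Note why the identity holds: for stationary π, π − πP = 0 and π(1ᵀ1) = (π1ᵀ)1 = 1 because the entries of π sum to 1.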
Suppose Xn is a Markov chain on the state space S with transition probability p. Let Yn be an independent copy of the Markov chain with transition probability p, and define Zn := (Xn, Yn). a) Prove that Zn is a Markov chain on the state space S_hat := S × S with transition probability p_hat : S_hat × S_hat → [0, 1] given by p_hat((x1, y1), (x2, y2)) := p(x1, x2)p(y1, y2). b) Prove that if π is a...
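The product structure in part (a) is exactly a Kronecker product: ordering S_hat with the X-coordinate varying slowest, p_hat is P ⊗ P, and a candidate stationary distribution for part (b) is π ⊗ π. A quick numerical sketch (the 2-state matrix P and its stationary π below are illustrative examples, not from the exercise):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])       # example transition matrix (assumed)
P_hat = np.kron(P, P)            # p_hat((x1,y1),(x2,y2)) = p(x1,x2) * p(y1,y2)
pi = np.array([5/6, 1/6])        # stationary for P (satisfies pi P = pi)
pi_hat = np.kron(pi, pi)         # candidate stationary for the pair chain Z_n
```

The mixed-product property (π ⊗ π)(P ⊗ P) = (πP) ⊗ (πP) shows π_hat is stationary whenever π is.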
1. Exit times. Let X be a discrete-time Markov chain (with discrete state space) and suppose p_ii > 0. Let T = min{n ≥ 1 : X_n ≠ i} be the exit time from state i. Show that T has a geometric distribution with respect to the conditional probability P(· | X_0 = i).
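A sketch of the target distribution, assuming X_0 = i: the chain must hold at i for the first n − 1 steps and then leave, so by the Markov property P(T = n | X_0 = i) = p_ii^(n−1)(1 − p_ii), i.e. T is Geometric(1 − p_ii). The function name `exit_time_pmf` is illustrative:

```python
def exit_time_pmf(p_ii: float, n: int) -> float:
    """Claimed pmf of the exit time T under P(. | X_0 = i):
    stay at i for n-1 steps (probability p_ii each), then leave."""
    return p_ii ** (n - 1) * (1 - p_ii)
```

As a sanity check, the pmf sums to 1 over n = 1, 2, 3, ... whenever 0 < p_ii < 1.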
Let α and β be positive constants. Consider a continuous-time Markov chain X(t) with state space S = {0, 1, 2} and jump rates q(i, i+1) = β for 0 ≤ i ≤ 1 and q(j, j−1) = α for 1 ≤ j ≤ 2. Find the stationary probability distribution π = (π0, π1, π2) for this chain.
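This is a birth-death chain, so detailed balance β π_k = α π_{k+1} gives π_k ∝ (β/α)^k; a minimal sketch of the normalized answer (the function name is an illustrative choice):

```python
def stationary_birth_death(alpha: float, beta: float):
    """Detailed balance beta * pi_k = alpha * pi_{k+1} on {0, 1, 2}
    gives pi_k proportional to (beta/alpha)^k; normalize to sum to 1."""
    r = beta / alpha
    weights = [1.0, r, r * r]
    Z = sum(weights)                 # normalizing constant 1 + r + r^2
    return [w / Z for w in weights]
```

For α = β the three states are equally likely, π = (1/3, 1/3, 1/3).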