Question

1. Use this exercise to convince yourself that, under different underlying probability measures, the same process can be a stationary discrete-time Markov chain with different transition matrices. (We consider only two probability measures in this problem; there are many other choices for which the process is not stationary or does not satisfy the Markov property.) Consider a process (X_t) that moves between two states, 0 and 1. At each time step t = 0, 1, 2, 3, ..., a coin is flipped. At time t = 0, if the coin lands heads the process starts in state 1, and if it lands tails the process starts in state 0. For t ≥ 1, if the coin lands heads the process stays in whichever state it is currently in, and if it lands tails the process switches states.

(a) Suppose the underlying probability measure P treats the coin as fair and the flips as independent. Give a brief justification of why (X_t)_{t≥0} is a stationary discrete-time Markov chain with respect to this probability P, and find the transition matrix P for the process.

(b) Suppose the underlying probability measure P treats the coin as biased so that the probability of flipping heads is 1/3, and again suppose that with respect to this probability the coin flips are independent. Give a brief justification of why (X_t)_{t≥0} is a stationary discrete-time Markov chain with respect to this probability P, and find the transition matrix P for the process.
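As a sanity check (not part of the original exercise; the helper name and sample size below are my own), note that the rule "heads = stay, tails = switch" means each row of the transition matrix is just (P(heads), P(tails)) arranged around the current state, so a short Monte Carlo simulation should recover a matrix with every entry near 1/2 in part (a), and rows near (1/3, 2/3) and (2/3, 1/3) in part (b). A minimal Python sketch:

```python
import numpy as np

def estimate_transition_matrix(p_heads, n_steps=200_000, seed=0):
    """Estimate the one-step transition matrix of the two-state chain
    driven by i.i.d. coin flips (heads = stay, tails = switch)."""
    rng = np.random.default_rng(seed)
    heads = rng.random(n_steps) < p_heads      # True where the flip is heads
    state = 1 if heads[0] else 0               # time 0: heads -> state 1, tails -> state 0
    counts = np.zeros((2, 2))                  # counts[i, j] = number of observed i -> j moves
    for h in heads[1:]:
        nxt = state if h else 1 - state        # stay on heads, switch on tails
        counts[state, nxt] += 1
        state = nxt
    return counts / counts.sum(axis=1, keepdims=True)

print(estimate_transition_matrix(1/2))   # part (a): every entry close to 1/2
print(estimate_transition_matrix(1/3))   # part (b): close to [[1/3, 2/3], [2/3, 1/3]]
```

The empirical frequencies converge to the theoretical entries because the flips are i.i.d., which is also the heart of the justification that the chain is Markov and stationary (time-homogeneous) under each of the two probability measures.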

The answer is one of the following:

Answers (in random order): 0.6; {-2, -1, 0, 1, 2}; 5/36; 19/64; 15/17; 1/3; (1/2, 1/2); (2/3, 1/3); and several transition matrices with entries drawn from {0, 1, p, 1-p, 1/4, 3/4}.

Please be descriptive!!

