Question
Could you help me with this question about a Markov chain, please?

We have 5 urns, all initially empty. Each day one ball is placed at random into one of the urns. The process repeats until each urn contains at least 1 ball. Let Xn denote the number of non-empty urns on day n.

(a) Show that {Xn} is a Markov chain by giving its transition matrix. (b) Find P(X3 <= 2) and P(X8 >= 4). (c) Find the expected number ...
Answer #1

a) The process {Xn} is a Markov chain because, given the current number of non-empty urns Xn = i, the next state depends only on where today's ball lands and not on the earlier history: the ball falls into an already non-empty urn with probability i/5 (so Xn+1 = i) and into an empty urn with probability (5 - i)/5 (so Xn+1 = i + 1).

This can be represented by the transition matrix as shown below:

P = \begin{pmatrix} 0 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0.2 & 0.8 & 0 & 0 & 0 \\ 0 & 0 & 0.4 & 0.6 & 0 & 0 \\ 0 & 0 & 0 & 0.6 & 0.4 & 0 \\ 0 & 0 & 0 & 0 & 0.8 & 0.2 \\ 0 & 0 & 0 & 0 & 0 & 1 \end{pmatrix}

In the above matrix, rows and columns are indexed by the 6 states 0, 1, 2, 3, 4, and 5 (the possible numbers of non-empty urns), and the entry in row i, column j is P(Xn+1 = j | Xn = i).
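
As a quick sanity check, the matrix can be rebuilt from the rule P(i -> i) = i/5, P(i -> i+1) = (5 - i)/5. Here is a minimal Python sketch (the use of NumPy is my own assumption, not part of the original answer):

```python
import numpy as np

# Transition matrix on states 0..5 (number of non-empty urns).
# From state i the next ball lands in an occupied urn w.p. i/5 (stay at i)
# and in an empty urn w.p. (5 - i)/5 (move to i + 1).
P = np.array([[i / 5 if j == i else (5 - i) / 5 if j == i + 1 else 0.0
               for j in range(6)]
              for i in range(6)])

print(P)              # reproduces the 6x6 matrix above
print(P.sum(axis=1))  # each row sums to 1
```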

b) P(X3 <= 2): after three balls at least one urn is non-empty, so X3 can only be 1, 2, or 3. The event X3 = 3 requires every ball to land in a new urn, which happens with probability 1*0.8*0.6 = 0.48. Hence

P(X3 <= 2) = 1 - P(X3 = 3) = 1 - 0.48 = 0.52

Therefore 0.52 is the required probability here.

P(X8 >= 4) is the probability that at least 4 urns are non-empty after 8 balls. (Putting every ball into the same urn, which has probability 1*0.2^7 = 0.0000128, is the event X8 = 1, not X8 >= 4.) It is easiest to use the complement:

P(X8 >= 4) = 1 - P(X8 = 1) - P(X8 = 2) - P(X8 = 3)

Reading the first row of P^8 (or counting the ways 8 balls can occupy exactly k of the 5 urns),

P(X8 = 1) ≈ 0.0000128, P(X8 = 2) ≈ 0.0065, P(X8 = 3) ≈ 0.1484

so P(X8 >= 4) ≈ 1 - 0.1549 = 0.8451

Therefore the required probability here is approximately 0.8451.
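
Both probabilities in part (b) can be double-checked with matrix powers of P. The following is a small sketch (assuming Python with NumPy, and taking X0 = 0, i.e. all urns empty on day 0):

```python
import numpy as np

# Transition matrix from part (a): from i non-empty urns, stay with prob i/5,
# move to i + 1 with prob (5 - i)/5.
P = np.array([[i / 5 if j == i else (5 - i) / 5 if j == i + 1 else 0.0
               for j in range(6)] for i in range(6)])

dist3 = np.linalg.matrix_power(P, 3)[0]  # distribution of X_3 given X_0 = 0
dist8 = np.linalg.matrix_power(P, 8)[0]  # distribution of X_8 given X_0 = 0

print(dist3[:3].sum())  # P(X_3 <= 2) = 0.52
print(dist8[4:].sum())  # P(X_8 >= 4) ≈ 0.8451
```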

c) One urn becomes non-empty on day 1 with certainty, so the expected number of days for k = 1 urn to be non-empty is 1.

Now for k = 2, let X be the expected number of days until 2 urns are non-empty, and let E1 be the expected number of additional days needed once 1 urn is non-empty. Conditioning on the next ball,
E1 = 0.8*1 + 0.2*(1 + E1)
0.8*E1 = 1
E1 = 1.25
so X = 1 + 1.25 = 2.25.

Therefore 2.25 is the expected number of days here for k = 2.

For k = 3, the extra wait to go from 2 to 3 non-empty urns has expectation E2, where

E2 = 0.6*1 + 0.4*(1 + E2)
0.6*E2 = 1
E2 = 1.667
so Y = 2.25 + 1.667 = 3.917.

Therefore the expected number of days here is about 3.917 for k = 3.

For k = 4, the extra wait to go from 3 to 4 non-empty urns has expectation E3, where

E3 = 0.4*1 + 0.6*(1 + E3)
0.4*E3 = 1
E3 = 2.5
so Z = 3.917 + 2.5 = 6.417.

Therefore about 6.417 days is the expected number of days here for k = 4. (If part (c) asks for all 5 urns to be non-empty, add the final expected wait 1/0.2 = 5 days, giving 6.417 + 5 ≈ 11.417 days in total.)
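
These expectations can also be obtained by summing the geometric waiting times 5/(5 - i) to go from i to i + 1 non-empty urns. A short Python sketch (Fraction is used only for exact arithmetic; the loop and variable names are my own):

```python
from fractions import Fraction

# Expected days until k urns are non-empty: the wait to go from i to i + 1
# occupied urns is geometric with success probability (5 - i)/5, so its
# mean is 5 / (5 - i). Summing these gives the cumulative expectations.
expected_days = Fraction(0)
for i in range(5):                      # i = current number of non-empty urns
    expected_days += Fraction(5, 5 - i)
    print(i + 1, float(expected_days))  # k: 1, 2.25, 3.917, 6.417, 11.417
```

The last printed value, about 11.42 days, is the familiar coupon-collector expectation 5*(1 + 1/2 + 1/3 + 1/4 + 1/5) for filling all five urns.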
