taxicab driver moves between the airport A and two hotels B and C according to the...
3. A spider hunts a fly moving between the positions 1 and 2. The spider moves according to a Markov chain with transition matrix
    0.7  0.3
    0.3  0.7
The fly, independently of the spider, moves between 1 and 2 according to a second Markov chain whose transition matrix is
    0.4  0.6
    0.6  0.4
The hunt finishes the first time both the spider and the fly are at the same position. (a) Describe the hunt with a suitable 3-state Markov chain; (b) Assuming that...
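A minimal Python sketch of part (a), assuming the spider's matrix is (0.7, 0.3; 0.3, 0.7) and the fly's is (0.4, 0.6; 0.6, 0.4) as the statement suggests: since the two move independently, the pair (spider position, fly position) is itself a Markov chain, and lumping the "same position" outcomes into one absorbing "caught" state leaves the three states (1,2), (2,1), caught.

```python
import numpy as np

# Transition matrices as read from the problem (rows: current position 1, 2)
S = np.array([[0.7, 0.3],
              [0.3, 0.7]])   # spider
F = np.array([[0.4, 0.6],
              [0.6, 0.4]])   # fly

# From state (1,2): spider and fly move independently, so e.g.
# P[(1,2) -> (2,1)] = S[0,1] * F[1,0].
p_12_to_12 = S[0, 0] * F[1, 1]             # 0.7 * 0.4 = 0.28
p_12_to_21 = S[0, 1] * F[1, 0]             # 0.3 * 0.6 = 0.18
p_12_caught = 1 - p_12_to_12 - p_12_to_21  # spider and fly meet

# States in order: (1,2), (2,1), caught. Rows for (1,2) and (2,1)
# coincide by the symmetry of S and F; "caught" is absorbing.
P = np.array([[p_12_to_12, p_12_to_21, p_12_caught],
              [p_12_to_21, p_12_to_12, p_12_caught],
              [0.0,        0.0,        1.0]])
print(P)
```

With these numbers the hunt ends on any given step with probability 0.54, whichever of the two non-absorbing states it is in.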
I need help with each question, please.
10. Charles and Denise also live in Taxicab City. Charles works at the cupcake shop at C = (-2, 3) and Denise works at the donut shop at D = (2, -1). Charles and Denise are looking for places to live, but they do not necessarily have to live at street corners. Put your answers to these questions on separate graphs. (a) Is it possible for Charles and Denise to live...
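Taxicab City problems use the taxicab (Manhattan) metric rather than the usual Euclidean one; a small helper, using the workplace coordinates C = (-2, 3) and D = (2, -1) given in the problem:

```python
def taxicab_distance(p, q):
    """Taxicab (Manhattan) distance: |x1 - x2| + |y1 - y2|."""
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

C = (-2, 3)   # Charles's cupcake shop
D = (2, -1)   # Denise's donut shop

# Distance between the two workplaces in the taxicab metric:
print(taxicab_distance(C, D))  # -> 8
```

The set of points at a fixed taxicab distance from a given point is a diamond (a square rotated 45°), which is what the "separate graphs" in each part end up showing.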
Problem 4. Smith has three records A, B, C which he keeps in a stack. (These are like mp3 files, but come in the shape of a physical flat disc!) After he plays a record, he puts it at the top of the stack. His favourite is A, which he selects to listen to with probability 2/3. He selects B with probability 1/6, and he selects C also with probability 1/6. This defines a Markov chain, with the state of the system...
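A short sketch of the state space this sets up, assuming (as is standard for this "move-to-front" problem) that the state is the current stack order, top record first, giving six states:

```python
from itertools import permutations

# Selection probabilities from the problem statement
p = {'A': 2/3, 'B': 1/6, 'C': 1/6}

# States: the 6 possible stack orders, top of stack listed first
states = list(permutations('ABC'))

def step(order, record):
    """Move-to-front update: the played record goes to the top."""
    rest = [r for r in order if r != record]
    return (record, *rest)

# Transitions out of one example state (A on top, then B, then C):
order = ('A', 'B', 'C')
for record, prob in p.items():
    print(order, '->', step(order, record), 'with prob', prob)
```

Note that the chosen record's probability does not depend on the current order, so each row of the 6x6 transition matrix has at most three nonzero entries, 2/3, 1/6, 1/6.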
Q4 and Q5
thanks!
4. Consider the Markov chain on S = {1, 2, 3, 4, 5} running according to the transition probability matrix
        1/3  1/3   0   1/3   0
         0   1/2   0    0   1/2
    P =  0    0   1/4  3/4   0
         0    0   1/2  1/2   0
         0   1/2   0    0   1/2
(a) Find lim_{n→∞} p^(n)_{jk} for j, k = 1, 2, ..., 5. (b) If the chain starts in state 1, what is the expected number of times the chain spends in state 1? (including the starting point). (c) If...
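A quick numerical check for part (a), assuming the matrix reads as above (state 1 transient, with closed classes {2, 5} and {3, 4}): raising P to a large power approximates the limit matrix.

```python
import numpy as np

# Transition matrix as read from the problem (states 1..5)
P = np.array([[1/3, 1/3, 0,   1/3, 0  ],
              [0,   1/2, 0,   0,   1/2],
              [0,   0,   1/4, 3/4, 0  ],
              [0,   0,   1/2, 1/2, 0  ],
              [0,   1/2, 0,   0,   1/2]])

# Approximate lim_{n -> infinity} P^n by a large matrix power
limit = np.linalg.matrix_power(P, 200)
print(np.round(limit, 4))
```

The printout matches the hand computation: rows 2 and 5 converge to (0, 1/2, 0, 0, 1/2), rows 3 and 4 to (0, 0, 2/5, 3/5, 0), and row 1 splits evenly between the two closed classes, giving (0, 1/4, 1/5, 3/10, 1/4).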
Suppose Alice is sitting at a circular table with 4 chairs
labeled {1, 2, 3, 4} and sitting initially at a random chair. Every
minute she moves to her left or right at random with equal
probability. Consider the Markov chain associated to the sequence
of her positions X0, X1, . . . .
1. Write the state space, the distribution of X0 and the
distribution of X1.
2. Write the transition matrix.
3. Assume she is at chair one...
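A hedged sketch of parts 1 and 2, assuming "at random" for the initial chair means uniform over the four chairs: the walk is a simple random walk on a 4-cycle, and one step of the chain preserves the uniform distribution.

```python
import numpy as np

n = 4  # chairs labeled 1..4 around the circular table

# Transition matrix: from each chair, move to the left or right
# neighbour on the cycle with probability 1/2 each
P = np.zeros((n, n))
for i in range(n):
    P[i, (i - 1) % n] = 0.5
    P[i, (i + 1) % n] = 0.5

# X0 uniform over the four chairs; the distribution of X1 is x0 @ P
x0 = np.full(n, 1/4)
x1 = x0 @ P
print(P)
print(x1)  # stays uniform: [0.25 0.25 0.25 0.25]
```

The uniform distribution is stationary here because every row and every column of P sums to 1 (P is doubly stochastic).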
Problem 3. (2 points) Consumers in Shelbyville have a choice of one of two fast food restaurants, Krusty's and McDonald's. Both have trouble keeping customers. Of those who last went to Krusty's, 56% will go to McDonald's next time, and of those who last went to McDonald's, 82% will go to Krusty's next time. (a) Find the transition matrix describing this situation (assume that state 1 is "Krusty's" and state 2 is "McDonald's"). (b) A customer goes out for fast...
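A sketch of part (a) and the long-run behaviour, using the switching percentages from the statement (so 44% stay at Krusty's and 18% stay at McDonald's):

```python
import numpy as np

# Rows = where the customer last went, columns = where they go next
#            Krusty's  McDonald's
P = np.array([[0.44,    0.56],   # 56% of Krusty's customers switch
              [0.82,    0.18]])  # 82% of McDonald's customers switch

# Long-run market shares: the stationary distribution pi with pi = pi P,
# approximated here by iterating the chain from an arbitrary start
pi = np.array([0.5, 0.5])
for _ in range(100):
    pi = pi @ P
print(pi)
```

Solving pi = pi P by hand gives pi = (41/69, 28/69), roughly 59.4% Krusty's to 40.6% McDonald's, which the iteration converges to.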
Hello, please use Markov process for the problem. Please make
the explanations simple and understandable, I don't have a
statistics background. Thank you!
4.3 A taxi driver conducts his business in three different towns 1, 2, and 3. On any given day, when he is in town 1, the probability that the next passenger he picks up is going to a place in town 1 is 0.3, the probability that the next passenger he picks up is going to town...
John finds a bill on his desk. He has three options: ignore it and leave it on his own desk, move the bill over to his wife Mary's desk, or pay the bill immediately. The probability that he leaves it on his own desk is 0.2. The probability that he moves it to Mary's desk is 0.7. The probability that he pays the bill immediately is 0.1. Similarly, if Mary finds a bill on her desk she can choose to...
Problem 2. Tlaloc has 4 umbrellas, each either at home or at work. Each time he goes to work or back, it rains with probability p, independently of all other times. If it rains and there is at least one umbrella with him, he takes it. Otherwise, he gets wet. Let Xn be the number of umbrellas at his current location after n trips (so n even corresponds to home and n odd to work). (a) Find the transition probabilities...
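A sketch of the transition probabilities for part (a), under the reading that if the current location holds i umbrellas, the other holds 4 - i; a dry trip leaves the counts unchanged (he arrives to 4 - i), while a rainy trip with i > 0 carries one umbrella over (he arrives to 4 - i + 1):

```python
import numpy as np

def umbrella_chain(p, n_umbrellas=4):
    """Transition matrix for X_n = umbrellas at the current location."""
    n = n_umbrellas
    P = np.zeros((n + 1, n + 1))
    P[0, n] = 1.0               # no umbrella to take; arrive to all n
    for i in range(1, n + 1):
        P[i, n - i] = 1 - p     # dry trip: umbrellas stay where they are
        P[i, n - i + 1] = p     # rainy trip: one umbrella travels along
    return P

P = umbrella_chain(p=0.3)
print(P)
```

For example, with all 4 umbrellas at the current location, the next location ends up with 0 umbrellas on a dry trip (probability 0.7) or 1 on a rainy one (probability 0.3).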