Question

In: Advanced Math

Consider the following Markov chain with P{X0 = 2} = 0.6 and P{X0 = 4} = 0.4:

      1     2     3     4     5     6
1     0     0     0     0     1     0
2    .2   .05     0    .6     0   .15
3     0     0    .8     0     0    .2
4     0    .6     0    .2     0    .2
5     1     0     0     0     0     0
6     0     0    .7     0     0    .3

a. What is P{X1 = 4, X2 = 6 | X0 = 2}?

b. What is P{X2 = 6 | X0 = 2}? What is P{X18 = 6 | X16 = 2}?

c. What is P{X0 = 2, X1 = 4, X2 = 6}?

d. What is P{X1 = 4, X2 = 6}?

Solutions

Expert Solution

The one-step transition matrix P is shown above. Starting at X0 = 2, the distribution of X1 is row 2 of P, namely (.2, .05, 0, .6, 0, .15). Given X1 = 4, the distribution of X2 is row 4 of P, namely (0, .6, 0, .2, 0, .2). Thus P{X1 = 4 | X0 = 2} = P(2,4) = 0.6 and P{X2 = 6 | X1 = 4} = P(4,6) = 0.2.

a) By the Markov property,
P{X1 = 4, X2 = 6 | X0 = 2} = P(2,4) P(4,6) = (0.6)(0.2) = 0.12.

b) By the law of total probability and the Markov property, summing over the intermediate state k,
P{X2 = 6 | X0 = 2} = Σk P(2,k) P(k,6)
= (.2)(0) + (.05)(.15) + (0)(.2) + (.6)(.2) + (0)(0) + (.15)(.3)
= 0.0075 + 0.12 + 0.045 = 0.1725.
Since the chain is time-homogeneous, P{X18 = 6 | X16 = 2} = P{X2 = 6 | X0 = 2} = 0.1725.

c) Multiplying the initial probability by the two one-step transition probabilities,
P{X0 = 2, X1 = 4, X2 = 6} = P{X0 = 2} P(2,4) P(4,6) = (0.6)(0.6)(0.2) = 0.072.

d) Conditioning on X0 (only states 2 and 4 have positive initial probability),
P{X1 = 4, X2 = 6} = [P{X0 = 2} P(2,4) + P{X0 = 4} P(4,4)] P(4,6)
= [(0.6)(0.6) + (0.4)(0.2)](0.2) = (0.44)(0.2) = 0.088.
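As a quick numerical check, all four answers can be reproduced with a few lines of code. The following is a minimal sketch in Python (assuming NumPy is available); the names P and mu0 are illustrative, not from the original solution, and states 1-6 are mapped to array indices 0-5.

import numpy as np

# One-step transition matrix; state i corresponds to row/column i-1.
P = np.array([
    [0.0, 0.00, 0.0, 0.0, 1.0, 0.00],
    [0.2, 0.05, 0.0, 0.6, 0.0, 0.15],
    [0.0, 0.00, 0.8, 0.0, 0.0, 0.20],
    [0.0, 0.60, 0.0, 0.2, 0.0, 0.20],
    [1.0, 0.00, 0.0, 0.0, 0.0, 0.00],
    [0.0, 0.00, 0.7, 0.0, 0.0, 0.30],
])

# Initial distribution: P{X0 = 2} = 0.6, P{X0 = 4} = 0.4.
mu0 = np.array([0.0, 0.6, 0.0, 0.4, 0.0, 0.0])

a = P[1, 3] * P[3, 5]             # P{X1=4, X2=6 | X0=2}  -> 0.12
b = (P @ P)[1, 5]                 # P{X2=6 | X0=2}        -> 0.1725
c = mu0[1] * P[1, 3] * P[3, 5]    # P{X0=2, X1=4, X2=6}   -> 0.072
d = (mu0 @ P)[3] * P[3, 5]        # P{X1=4, X2=6}         -> 0.088

print(a, b, c, d)

By time-homogeneity, the value b also equals P{X18 = 6 | X16 = 2}.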

