Question

In: Statistics and Probability

Expected number of time intervals until the Markov Chain below is in state 2 again after starting in state 2?

Transition matrix P:

[0.4, 0.2, 0.4]

[0.6, 0.0, 0.4]

[0.2, 0.5, 0.3]

Solutions

Expert Solution

The mean recurrence time for state i is r_i = 1/w_i, where w_i is the i-th component of the fixed probability vector (stationary distribution) of the transition matrix.

We need to find the fixed probability vector W = [a, b, c] for the transition matrix P, i.e. the vector satisfying

WP = W,

where P is the transition matrix given above. Writing this out componentwise gives

0.4a + 0.6b + 0.2c = a  =>  -0.6a + 0.6b + 0.2c = 0   ---(1)

0.2a + 0b + 0.5c = b  =>  0.2a - b + 0.5c = 0   ---(2)

0.4a + 0.4b + 0.3c = c  =>  0.4a + 0.4b - 0.7c = 0   ---(3)

Also, a + b + c = 1   ---(4)
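
As a quick cross-check on the elimination carried out below, the same system (WP = W together with a + b + c = 1) can also be solved numerically. This is only a sketch, assuming NumPy is available and that the states are indexed 1, 2, 3 in the order of the rows of P:

import numpy as np

# Transition matrix P (rows correspond to states 1, 2, 3)
P = np.array([[0.4, 0.2, 0.4],
              [0.6, 0.0, 0.4],
              [0.2, 0.5, 0.3]])

# The fixed vector W solves W P = W, i.e. (P - I)^T W^T = 0,
# together with the normalization a + b + c = 1.
A = np.vstack([(P - np.eye(3)).T, np.ones(3)])   # 4 equations, 3 unknowns
rhs = np.array([0.0, 0.0, 0.0, 1.0])
W, *_ = np.linalg.lstsq(A, rhs, rcond=None)      # least-squares solution of the consistent system

print(W)  # approximately [0.378788, 0.257576, 0.363636]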

From (1) and (2)

-0.6a + 0.6b + 0.2c = 0

0.6 * (0.2a - b + 0.5c) = 0 => 0.12a - 0.6b + 0.3c = 0

Adding the above equations, we get

-0.48a + 0.5c = 0   ---(5)

Adding (2) and (4), we get

1.2a + 1.5c = 1  =>  0.4(1.2a + 1.5c) = 0.4  =>  0.48a + 0.6c = 0.4   ---(6)

Adding (5) and (6), we get

1.1c = 0.4 => c = 0.4/1.1 =  0.3636364

Now, -0.48a + 0.5c = 0  =>  0.48a = 0.5 * 0.3636364

=> a = (0.5/0.48) * (0.3636364) = 0.3787879

b = 1 - (a + c) = 1 - (0.3787879 + 0.3636364) = 0.2575757

So, the fixed probability vector for the transition matrix is W = [0.3787879, 0.2575757, 0.3636364].

So, the expected number of time intervals until the Markov chain is in state 2 again after starting in state 2 is the mean recurrence time for state 2: r_2 = 1/b = 1/0.2575757 ≈ 3.882354, i.e., about 3.88 time intervals.
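
Continuing the NumPy sketch above, the mean recurrence time for state 2 is just the reciprocal of the second component of W, and one can also confirm that W really is a fixed vector of P:

# Mean recurrence time for state 2 = 1 / (second component of W)
r2 = 1.0 / W[1]
print(r2)                      # approximately 3.8824 (the exact value is 66/17)
print(np.allclose(W @ P, W))   # True: W is indeed a fixed probability vector of P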

