In: Statistics and Probability
What is the expected number of time intervals until the Markov chain below is in state 2 again, after starting in state 2?
Matrix:
P =
[0.4, 0.2, 0.4]
[0.6, 0.0, 0.4]
[0.2, 0.5, 0.3]
The mean recurrence time for state i is ri = 1/wi, where wi is the ith component of the fixed probability vector for the transition matrix.
We need to find the fixed probability vector W = [a, b, c] of the transition matrix P, which satisfies
WP = W.
Writing this out component by component gives:
0.4a + 0.6b + 0.2c = a => -0.6a + 0.6b + 0.2c = 0 ---(1)
0.2a + 0.5c = b => 0.2a - b + 0.5c = 0 --(2)
0.4a + 0.4b + 0.3c = c => 0.4a + 0.4b - 0.7c = 0 --(3)
Also, a + b + c = 1 ---(4)
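Before solving by hand, the same fixed-vector system can be checked numerically. Below is a minimal sketch, assuming NumPy is available; the array names P, rhs, and W are just illustrative.

```python
import numpy as np

# Transition matrix from the problem (each row sums to 1).
P = np.array([[0.4, 0.2, 0.4],
              [0.6, 0.0, 0.4],
              [0.2, 0.5, 0.3]])

# W satisfies W P = W, i.e. (P^T - I) W^T = 0, together with a + b + c = 1.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
rhs = np.array([0.0, 0.0, 0.0, 1.0])

# Least-squares solve of the stacked (overdetermined but consistent) system.
W, *_ = np.linalg.lstsq(A, rhs, rcond=None)
print(W)  # approximately [0.3787879, 0.2575758, 0.3636364]
```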
Eliminate b from (1) and (2). Multiply (2) by 0.6:
0.6 * (0.2a - b + 0.5c) = 0 => 0.12a - 0.6b + 0.3c = 0
Adding this to (1), -0.6a + 0.6b + 0.2c = 0, gives
-0.48a + 0.5c = 0 ---(5)
Adding (2) and (4) also eliminates b:
1.2a + 1.5c = 1 => 0.4 * (1.2a + 1.5c) = 0.4 => 0.48a + 0.6c = 0.4 ---(6)
Adding (5) and (6), we get
1.1c = 0.4 => c = 0.4/1.1 = 0.3636364
Now, from (5), -0.48a + 0.5c = 0 => 0.48a = 0.5 * 0.3636364
=> a = (0.5/0.48) * (0.3636364) = 0.3787879
b = 1 - (a + c) = 1 - (0.3787879 + 0.3636364) = 0.2575757
So the fixed probability vector for the transition matrix is W = [0.3787879, 0.2575757, 0.3636364].
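A quick numerical check (again assuming NumPy) that this vector is indeed fixed under P, i.e. that WP returns W up to rounding:

```python
import numpy as np

P = np.array([[0.4, 0.2, 0.4],
              [0.6, 0.0, 0.4],
              [0.2, 0.5, 0.3]])
W = np.array([0.3787879, 0.2575757, 0.3636364])

# W is a fixed vector if W P reproduces W (up to the rounding in the decimals).
print(W @ P)                             # ~ [0.3787879, 0.2575758, 0.3636364]
print(np.allclose(W @ P, W, atol=1e-6))  # True
```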
So the expected number of time intervals until the Markov chain is in state 2 again, after starting in state 2, is the mean recurrence time for state 2:
r2 = 1/b = 1/0.2575757 ≈ 3.882353
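As an independent check of that answer, a small simulation sketch (assuming NumPy; the helper name mean_return_time, the seed, and the number of returns are illustrative choices) runs the chain and averages the gaps between consecutive visits to state 2; the estimate should come out near 3.88.

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.4, 0.2, 0.4],
              [0.6, 0.0, 0.4],
              [0.2, 0.5, 0.3]])

def mean_return_time(start, n_returns=100_000):
    """Average number of steps between consecutive visits to `start`."""
    state = start
    steps = 0
    total = 0
    returns = 0
    while returns < n_returns:
        state = rng.choice(3, p=P[state])  # take one step of the chain
        steps += 1
        if state == start:                 # returned: record the gap length
            total += steps
            steps = 0
            returns += 1
    return total / returns

# State 2 of the problem is index 1 here (0-based); expect roughly 3.88.
print(mean_return_time(1))
```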