In: Statistics and Probability
The following is the transition probability matrix P of a Markov chain with states 1, 2, 3, 4:
| | 1 | 2 | 3 | 4 |
|---|---|---|---|---|
| 1 | .4 | .3 | .2 | .1 |
| 2 | .2 | .2 | .2 | .4 |
| 3 | .25 | .25 | .5 | 0 |
| 4 | .2 | .1 | .4 | .3 |
If X₀ = 1,
(a) find the probability that state 3 is entered before state
4;
(b) find the mean number of transitions until either state 3 or
state 4 is entered.
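Both parts can be checked numerically by first-step analysis: treat states 3 and 4 as absorbing, condition on the first transition out of the transient states 1 and 2, and solve the resulting linear systems. Below is a sketch in Python with NumPy; the variable names (`Q`, `r`, `p`, `m`) are my own, not part of the problem.

```python
import numpy as np

# Transition probabilities among the transient states 1 and 2
# (states 3 and 4 are treated as absorbing for this question).
Q = np.array([[0.4, 0.3],
              [0.2, 0.2]])   # rows/cols indexed by states 1, 2

I = np.eye(2)

# (a) p_i = P(reach state 3 before state 4 | X0 = i).
# First-step analysis gives (I - Q) p = r, where r_i is the
# one-step probability of jumping from i directly to state 3.
r = np.array([0.2, 0.2])     # P(1 -> 3), P(2 -> 3)
p = np.linalg.solve(I - Q, r)

# (b) m_i = E[number of transitions until state 3 or 4 | X0 = i].
# Conditioning on the first step gives (I - Q) m = 1.
m = np.linalg.solve(I - Q, np.ones(2))

print(p[0])   # 11/21 ≈ 0.5238
print(m[0])   # 55/21 ≈ 2.6190
```

Solving the 2×2 systems by hand gives the same values: p₁ = 11/21 for part (a) and m₁ = 55/21 ≈ 2.62 transitions for part (b).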