In: Statistics and Probability
The following is the transition probability matrix of a Markov chain with states
1,2,3,4
        ⎛ .4   .3   .2   .1 ⎞
P =  ⎜ .2   .2   .2   .4 ⎟
        ⎜ .25  .25  .5   0  ⎟
        ⎝ .2   .1   .4   .3 ⎠
If X0 = 1,
(a) find the probability that state 3 is entered before state 4;
(b) find the mean number of transitions until either state 3 or state 4 is entered.
Answer:-
Given That:-
The transition probability matrix of a Markov chain with states 1, 2, 3, 4, starting from X0 = 1.

(a) The probability that state 3 is entered before state 4.

Make states 3 and 4 absorbing and, for i = 1, 2, let αi be the probability that state 3 is entered before state 4 given that the chain starts in state i. Conditioning on the first transition gives

α1 = P13 + P11 α1 + P12 α2 = .2 + .4 α1 + .3 α2
α2 = P23 + P21 α1 + P22 α2 = .2 + .2 α1 + .2 α2

From the second equation, α2 = (.2 + .2 α1)/.8 = .25 + .25 α1. Substituting into the first,

α1 = .2 + .4 α1 + .3(.25 + .25 α1) = .275 + .475 α1,

so .525 α1 = .275, giving α1 = 11/21.

The probability that state 3 is entered before state 4 is 11/21 ≈ .5238.
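The two first-step equations above can also be solved by machine as a check. The sketch below (the dictionary `P` and the names `a1`, `a2` are my own, not from the problem) uses Python's `fractions` module to keep the arithmetic exact:

```python
from fractions import Fraction as F

# Rows 1 and 2 of P (the transient states), columns 1, 2 and 3,
# taken from the transition matrix in the problem.
P = {1: {1: F(4, 10), 2: F(3, 10), 3: F(2, 10)},
     2: {1: F(2, 10), 2: F(2, 10), 3: F(2, 10)}}

# First-step equations with states 3 and 4 made absorbing:
#   a1 = P13 + P11*a1 + P12*a2
#   a2 = P23 + P21*a1 + P22*a2
# Solve the second for a2 = c + k*a1, then substitute into the first.
c = P[2][3] / (1 - P[2][2])       # c = 1/4
k = P[2][1] / (1 - P[2][2])       # k = 1/4
a1 = (P[1][3] + P[1][2] * c) / (1 - P[1][1] - P[1][2] * k)
a2 = c + k * a1

print(a1, float(a1))  # 11/21 ≈ 0.5238
print(a2, float(a2))  # 8/21  ≈ 0.3810
```

Both fractions agree with the hand computation: a1 = 11/21, and starting from state 2 the probability would be 8/21.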
(b) The mean number of transitions until either state 3 or state 4 is entered.

For i = 1, 2, let mi be the expected number of transitions until the chain enters state 3 or state 4, starting from state i. Conditioning on the first transition gives

m1 = 1 + P11 m1 + P12 m2 = 1 + .4 m1 + .3 m2
m2 = 1 + P21 m1 + P22 m2 = 1 + .2 m1 + .2 m2

From the second equation, m2 = (1 + .2 m1)/.8 = 1.25 + .25 m1. Substituting into the first,

m1 = 1 + .4 m1 + .3(1.25 + .25 m1) = 1.375 + .475 m1,

so .525 m1 = 1.375, giving m1 = 55/21 ≈ 2.619 transitions,

which is the required solution.
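As a check on part (b), the expected-hitting-time equations for the set {3, 4} can be solved exactly the same way (a sketch; the names `m1`, `m2`, `c`, `k` are my own, not from the problem):

```python
from fractions import Fraction as F

# Rows 1 and 2 of P restricted to the transient states 1 and 2.
P = {1: {1: F(4, 10), 2: F(3, 10)},
     2: {1: F(2, 10), 2: F(2, 10)}}

# First-step equations for the expected number of transitions
# until the chain enters state 3 or state 4:
#   m1 = 1 + P11*m1 + P12*m2
#   m2 = 1 + P21*m1 + P22*m2
# Solve the second for m2 = c + k*m1, then substitute into the first.
c = 1 / (1 - P[2][2])             # c = 5/4
k = P[2][1] / (1 - P[2][2])       # k = 1/4
m1 = (1 + P[1][2] * c) / (1 - P[1][1] - P[1][2] * k)
m2 = c + k * m1

print(m1, float(m1))  # 55/21 ≈ 2.619
print(m2, float(m2))  # 40/21 ≈ 1.905
```

Starting from state 1 the mean is 55/21 ≈ 2.619 transitions; from state 2 it would be 40/21 ≈ 1.905.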