The computer center at Rockbottom University has been experiencing computer downtime. Let us assume that the trials of an associated Markov process are defined as one-hour periods and that the probability of the system being in a running state or a down state is based on the state of the system in the previous period. Historical data show the following transition probabilities:

| From \ To | Running | Down |
| --------- | ------- | ---- |
| Running   | 0.70    | 0.30 |
| Down      | 0.20    | 0.80 |

If the system is initially running, what is the probability of the system being down in the next hour of operation? What are the steady-state probabilities of the system being in the running state and in the down state? If required, round your answers to two decimal places.
SOLUTION:
From the given data, the historical transition probabilities are:

| From \ To | Running | Down |
| --------- | ------- | ---- |
| Running   | 0.70    | 0.30 |
| Down      | 0.20    | 0.80 |
If the system is initially running, what is the probability of the system being down in the next hour of operation? If required, round your answers to two decimal places.
The probability of the system being down in the next hour of operation = 0.30
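As a quick check, here is a minimal Python sketch (assuming NumPy and the state ordering [Running, Down], neither of which is specified in the original problem) that computes the one-step distribution by multiplying the initial distribution by the transition matrix:

```python
import numpy as np

# Transition matrix from the problem statement.
# Rows = current state, columns = next state, in the assumed order [Running, Down].
P = np.array([[0.70, 0.30],
              [0.20, 0.80]])

# The system starts in the Running state.
pi0 = np.array([1.0, 0.0])

# Distribution after one hour: row vector times transition matrix.
pi1 = pi0 @ P
print(f"P(Down after one hour) = {pi1[1]:.2f}")  # 0.30
```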
What are the steady-state probabilities of the system being in the running state and in the down state? If required, round your answers to two decimal places.
Let x = P(System Running) and y = P(System Down) denote the steady-state probabilities.
Then, using the given table, we get the equations:
x = 0.7x + 0.2y
or 0.3x - 0.2y = 0 -----(i)
y = 0.3x + 0.8y
or 0.3x - 0.2y = 0 -----(ii)
We see that (i) and (ii) are the same equation, so the system is dependent and we can rewrite it as
y = (0.3 / 0.2) x, or y = 1.5x.
We know that the probabilities must sum to 1, so
x + y = 1
x + 1.5x = 1
2.5x = 1
x = 1 / 2.5 = 0.4
Plugging into y = 1.5x gives y = 0.6.
Hence the steady-state distribution is:
P(System Running) = x = 0.40
P(System Down) = y = 0.60
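The hand-derived steady state can also be verified numerically. Below is a minimal sketch (again assuming NumPy and the [Running, Down] ordering) that solves the balance equations pi = pi P together with the normalization x + y = 1 as a small least-squares problem:

```python
import numpy as np

P = np.array([[0.70, 0.30],
              [0.20, 0.80]])

# Balance equations pi (P - I) = 0, transposed to (P^T - I) pi^T = 0,
# plus the normalization row [1, 1] enforcing that the probabilities sum to 1.
A = np.vstack([P.T - np.eye(2), np.ones((1, 2))])
b = np.array([0.0, 0.0, 1.0])

# The stacked system is overdetermined but consistent, so least squares recovers it exactly.
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(f"P(Running) = {pi[0]:.2f}, P(Down) = {pi[1]:.2f}")  # 0.40, 0.60
```

An alternative check is to raise the transition matrix to a large power (or iterate pi <- pi P); the rows converge to the same [0.4, 0.6] distribution.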