Question

In: Statistics and Probability

Given the transition matrix P for a Markov chain, find the stable vector W. Write entries as fractions in lowest terms.

P = [ 0.5   0     0.5 ]
    [ 0.2   0.2   0.6 ]
    [ 0     1     0   ]

Solutions

Expert Solution

Given the transition matrix P for the Markov chain, let the stable vector be W = [X, Y, Z].

The stable vector must satisfy

WP = W

and X + Y + Z = 1    (1)

Writing WP = W out component by component, i.e. [X, Y, Z] P = [X, Y, Z], gives

0.5X + 0.2Y = X    (2)

0.2Y + Z = Y    (3)

0.5X + 0.6Y = Z    (4)
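
(Side check, not part of the original solution: the system (1)-(4) can also be solved directly with a computer algebra tool. Below is a minimal sketch, assuming Python with sympy is installed; the hand calculation continues after it.)

```python
# Minimal check of equations (1)-(4), assuming sympy is available.
from sympy import symbols, Rational, solve

X, Y, Z = symbols('X Y Z')

eqs = [
    Rational(1, 2)*X + Rational(1, 5)*Y - X,   # (2): 0.5X + 0.2Y = X
    Rational(1, 5)*Y + Z - Y,                  # (3): 0.2Y + Z = Y
    Rational(1, 2)*X + Rational(3, 5)*Y - Z,   # (4): 0.5X + 0.6Y = Z
    X + Y + Z - 1,                             # (1): normalization
]

print(solve(eqs, [X, Y, Z]))   # prints the stable vector as exact fractions
```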

Now, from equation (2):

0.5X + 0.2Y = X

0.2Y = X - 0.5X

0.2Y = 0.5X    (5)

From equation (3):

0.2Y + Z = Y

Z = Y - 0.2Y

Z = 0.8Y

Substitute Z = 0.8Y into equation (4):

0.5X + 0.6Y = 0.8Y

0.5X = 0.8Y - 0.6Y

0.5X = 0.2Y

X = 0.2Y / 0.5

X = 0.4Y    (6)

Substituting X = 0.4Y back into equation (5) gives 0.2Y = 0.5(0.4Y) = 0.2Y, which holds for every Y, so equations (2)-(4) alone do not determine Y. Use the normalization equation (1):

X + Y + Z = 1

0.4Y + Y + 0.8Y = 1

2.2Y = 1

Y = 1/2.2 = 5/11

Now put the Y value in equation (6):

X = 0.4Y = (2/5)(5/11) = 2/11

and in Z = 0.8Y:

Z = (4/5)(5/11) = 4/11

So the stable vector, with entries as fractions in lowest terms, is

W = [X, Y, Z] = [2/11, 5/11, 4/11]

Check: 2/11 + 5/11 + 4/11 = 1, as required by equation (1).
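
As a quick numerical check (a minimal sketch, assuming Python with numpy is available; not part of the original solution), one can verify that this W is left unchanged by P:

```python
# Verify W P = W and that the entries sum to 1, assuming numpy is available.
import numpy as np

P = np.array([[0.5, 0.0, 0.5],
              [0.2, 0.2, 0.6],
              [0.0, 1.0, 0.0]])

W = np.array([2/11, 5/11, 4/11])

print(np.allclose(W @ P, W))     # True: W is a stable (stationary) vector of P
print(np.isclose(W.sum(), 1.0))  # True: W is a probability vector
```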

