As in Example 2.20 of the 01-29 version of the lecture notes, consider the Markov chain with state space $S = \{0, 1\}$ and transition probability matrix
$$P = \begin{bmatrix} 1/2 & 1/2 \\ 0 & 1 \end{bmatrix}.$$

(a) Let $\mu$ be an initial distribution. Calculate the probability $P_\mu(X_1 = 0, X_7 = 1)$. (Your answer will depend on $\mu$.)

(b) Define the function $f : S \to \mathbb{R}$ by $f(0) = 2$, $f(1) = 1$. Let the initial distribution of the Markov chain be $\mu = [\mu(0), \mu(1)] = [4/7, 3/7]$. Calculate the expectation $E_\mu[f(X_3)]$. In plain English: start the Markov chain with initial distribution $\mu$ and run it until time 3. Collect a reward of \$2 if you find yourself in state 0 and a reward of \$1 if you find yourself in state 1. What is the expected reward? (Your numerical answer should be $15/14$.)
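As a sanity check on part (b), the distribution at time 3 can be computed exactly by propagating $\mu$ through three steps of $P$ (i.e., forming $\mu P^3$) and then averaging $f$ against the result. The sketch below is not part of the problem statement; it just verifies the stated answer using exact rational arithmetic:

```python
from fractions import Fraction as F

# Transition matrix P and initial distribution mu from the problem statement.
P = [[F(1, 2), F(1, 2)],
     [F(0),    F(1)]]
mu = [F(4, 7), F(3, 7)]

def step(dist, P):
    """One step of the chain: returns the row vector dist @ P."""
    return [sum(dist[i] * P[i][j] for i in range(len(dist)))
            for j in range(len(dist))]

# Distribution of X_3 is mu P^3.
dist = mu
for _ in range(3):
    dist = step(dist, P)

# Reward function f(0) = 2, f(1) = 1.
f = {0: F(2), 1: F(1)}
expected = sum(dist[s] * f[s] for s in (0, 1))
print(dist)      # distribution of X_3: [1/14, 13/14]
print(expected)  # E_mu[f(X_3)] = 15/14
```

Since state 1 is absorbing, the only way to be in state 0 at time 3 is to start in state 0 and stay there, which has probability $\mu(0) \cdot (1/2)^3 = (4/7)(1/8) = 1/14$; the code confirms $E_\mu[f(X_3)] = 2 \cdot \tfrac{1}{14} + 1 \cdot \tfrac{13}{14} = \tfrac{15}{14}$.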