Let Xn be a simple random walk (p = 1/2) on {0, 1, · · · , 100} with absorbing boundaries. Suppose X0 = 50. Let T = min{j : Xj = 0 or Xj = 100}. Let Fn denote the information contained in X1, · · · , Xn.
(1) Verify that Xn is a martingale.
(2) Find P(XT = 100).
(3) Let Mn = Xn² − n. Verify that Mn is also a martingale.
(4) It is known that Mn and T satisfy the assumptions of the Optional Sampling Theorem, and hence E(MT) = E(M0). Use this fact to find E(T).
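As a sanity check on the exercise, the martingale arguments give P(XT = 100) = 1/2 (from E(XT) = X0 = 50) and E(T) = E(XT²) − X0² = 5000 − 2500 = 2500. The sketch below (all function names are my own, for illustration) estimates both quantities by Monte Carlo simulation:

```python
import random

def absorption(start=50, upper=100, rng=random):
    """Run one simple symmetric random walk (p = 1/2) started at `start`
    until it is absorbed at 0 or `upper`.
    Returns (absorbing state X_T, absorption time T)."""
    x, t = start, 0
    while 0 < x < upper:
        x += 1 if rng.random() < 0.5 else -1
        t += 1
    return x, t

def estimate(n_runs=3000, seed=0):
    """Monte Carlo estimates of P(X_T = 100) and E(T)."""
    rng = random.Random(seed)
    hits, total_t = 0, 0
    for _ in range(n_runs):
        x, t = absorption(rng=rng)
        hits += (x == 100)
        total_t += t
    return hits / n_runs, total_t / n_runs
```

With these defaults the two estimates should land near 1/2 and 2500, in agreement with the optional-sampling answers.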
Definition 7.19. We say a sub-probability, $\pi : S \to [0, 1]$, is invariant if $\pi P = \pi$, i.e.
$$\sum_{i \in S} \pi(i) P_{ij} = \pi(j) \quad \text{for all } j \in S. \tag{7.16}$$
An invariant probability, $\pi : S \to [0, 1]$, is called an invariant distribution.

Example 7.20. If $\#(S) < \infty$ and $p : S \times S \to [0, 1]$ is a Markov transition matrix with column sums adding up to 1, then $\pi(i) := \frac{1}{\#(S)}$ is an invariant distribution for $p$. In particular, if $p$ is a symmetric Markov transition matrix ($p(i, j) = p(j, i)$ for all $i, j \in S$), then the uniform distribution $\pi$ is an invariant distribution for $p$.

Example 7.21. Let $S$ be a finite set of nodes and let $G$ be an undirected graph on $S$, i.e. $G$ is a subset of $S \times S$ such that

1. $(x, x) \notin G$ for all $x \in S$,
2. if $(x, y) \in G$, then $(y, x) \in G$ [the graph is undirected], and
3. for all $x \in S$, the set $S_x := \{y \in S : (x, y) \in G\}$ is not empty. [We are not allowing for any isolated nodes in our graph.]

Let
$$\nu(x) := \#(S_x) = \sum_{y \in S} 1_{(x,y) \in G}$$
be the valence of $G$ at $x$. The random walk on this graph is then the Markov chain on $S$ with Markov transition matrix,
$$p(x, y) := \frac{1}{\nu(x)} 1_{S_x}(y) = \frac{1}{\nu(x)} 1_{(x,y) \in G}.$$
Notice that
$$\sum_{x \in S} \nu(x)\, p(x, y) = \sum_{x \in S} 1_{(x,y) \in G} = \sum_{x \in S} 1_{(y,x) \in G} = \nu(y).$$
Thus if we let $Z := \sum_{x \in S} \nu(x)$ and $\pi(x)$
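The identity $\sum_{x} \nu(x)\, p(x, y) = \nu(y)$ in Example 7.21 is easy to verify numerically. The sketch below builds a small undirected graph (the particular edge set is my own example, not from the text), forms the random-walk transition matrix, and checks the identity at every node:

```python
from itertools import product

# A small undirected graph on S = {0, 1, 2, 3}: edges 0-1, 1-2, 2-3, 3-0, 0-2,
# stored with both orientations and no self-loops, as in Example 7.21.
S = [0, 1, 2, 3]
G = {(0, 1), (1, 0), (1, 2), (2, 1), (2, 3), (3, 2),
     (3, 0), (0, 3), (0, 2), (2, 0)}

# Valence nu(x) = #(S_x), the number of neighbors of x.
nu = {x: sum(1 for y in S if (x, y) in G) for x in S}

# Transition matrix p(x, y) = (1 / nu(x)) * 1_{(x, y) in G}.
p = {(x, y): (1 / nu[x] if (x, y) in G else 0.0)
     for x, y in product(S, S)}

# Check sum_x nu(x) p(x, y) = nu(y) for every y, as computed in the text.
for y in S:
    assert abs(sum(nu[x] * p[(x, y)] for x in S) - nu[y]) < 1e-12
```

Each nonzero term in the sum contributes $\nu(x) \cdot \frac{1}{\nu(x)} = 1$, one for each neighbor of $y$, which is exactly why the sum collapses to $\nu(y)$.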