In: Statistics and Probability
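All of the questions below turn on the standard definition of a stopping time for a discrete-time process, recalled here for reference (this recap is not part of the original sheet):

```latex
% \tau : \Omega \to \{0,1,2,\dots\} \cup \{\infty\} is a stopping time
% with respect to \{X_n \mid n \ge 0\} if
\{\tau = n\} \in \sigma(X_0, \dots, X_n) \quad \text{for all } n \ge 0,
% or, equivalently,
\{\tau \le n\} \in \sigma(X_0, \dots, X_n) \quad \text{for all } n \ge 0.
```

Q4 and Q5 ask whether the events {τ > n} and {τ ≥ n}, respectively, can replace {τ = n} in this definition.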
Q1. Let {Xn | n ≥ 0} be a Markov chain with state space S. For i ∈ S, define τi = min{n ≥ 0 | Xn = i}. Show that τi is a stopping time for each i.

Q2. Let τi be as in Q1, but for an arbitrary discrete-time stochastic process. Is τi a stopping time?

Q3. Let {Xn | n ≥ 0} be a Markov chain and let i be a state. Define the random time τ = min{n ≥ 0 | Xn+1 = i}. Is τ a stopping time? Justify your answer.

Q4. Prove or disprove: τ is a stopping time (with respect to the Markov chain {Xn | n ≥ 0}) iff {τ > n} ∈ σ(X0, · · · , Xn) for all n ≥ 0.

Q5. Prove or disprove: τ is a stopping time (with respect to the Markov chain {Xn | n ≥ 0}) iff {τ ≥ n} ∈ σ(X0, · · · , Xn) for all n ≥ 0.

Q6. Let {Xn | n ≥ 0} be a Markov chain with state space S = {1, 2, · · · }, and let A ⊂ S be such that A ≠ S and 1 ∈ A. Define
(i) τ1 = min{n ≥ 0 | Xn+1 ∈ A},
(ii) τ2 = min{n ≥ τ1 | Xn = 1}.
Are τ1, τ2 stopping times? Justify your answer.
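As a sanity check on the definitions in Q1 and Q3, the two random times can be computed from a simulated path. The sketch below (a hypothetical two-state chain with transition matrix P; the chain and seed are illustrative, not part of the sheet) makes the contrast concrete: τi = min{n ≥ 0 | Xn = i} is decidable from X0, …, Xn alone, whereas τ = min{n ≥ 0 | Xn+1 = i} requires looking one step ahead at Xn+1.

```python
import random

random.seed(0)  # illustrative seed for reproducibility

# Hypothetical two-state chain on S = {0, 1}; row i of P gives P(X_{n+1} = . | X_n = i)
P = {0: [0.7, 0.3], 1: [0.4, 0.6]}

def simulate(x0, steps):
    """Simulate a path X_0, ..., X_steps of the chain."""
    path = [x0]
    for _ in range(steps):
        p0 = P[path[-1]][0]
        path.append(0 if random.random() < p0 else 1)
    return path

def tau_hit(path, i):
    """tau_i = min{n >= 0 | X_n = i}: the event {tau_i = n} is
    determined by X_0, ..., X_n alone (cf. Q1)."""
    for n, x in enumerate(path):
        if x == i:
            return n
    return None  # state i not reached within the simulated horizon

def tau_ahead(path, i):
    """tau = min{n >= 0 | X_{n+1} = i}: deciding {tau = n} needs X_{n+1},
    i.e. one step of the future (cf. Q3)."""
    for n in range(len(path) - 1):
        if path[n + 1] == i:
            return n
    return None

path = simulate(0, 20)
t_hit = tau_hit(path, 1)
t_ahead = tau_ahead(path, 1)
# Since X_0 = 0 != 1, whenever state 1 is reached we get t_ahead = t_hit - 1:
# tau "announces" the hit one step before it happens.
print(path, t_hit, t_ahead)
```

The same one-step-ahead structure appears in Q6(i), where τ1 is defined through Xn+1 ∈ A rather than Xn ∈ A.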