In: Statistics and Probability
Consider a sequence of random variables X0, X1, X2, X3, . . . which form a Markov chain.
(a) Define the Markov property for this Markov chain both in words and using a mathematical formula.
(b) When is a Markov chain irreducible?
(c) Give the definition for an ergodic state.
a) A sequence of random variables X0, X1, X2, ... is said to be a Markov chain if the future state depends only on the present state and is independent of all past states.
Mathematically,
P(X_{n+1} = a_{n+1} | X_n = a_n, X_{n-1} = a_{n-1}, ..., X_0 = a_0) = P(X_{n+1} = a_{n+1} | X_n = a_n).
Here the indices 0, 1, 2, ..., n of X are called steps, and a0, a1, a2, ..., an are called states of the Markov chain.
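The Markov property can be made concrete with a small simulation sketch (the 2-state transition matrix P below is a made-up example, not from the question): the next state is sampled using only the current state's row, so the future depends on the present alone.

```python
import random

# Hypothetical 2-state transition matrix: P[i][j] = P(X_{n+1} = j | X_n = i).
P = [[0.7, 0.3],
     [0.4, 0.6]]

def step(state):
    # The next state is drawn using only the current state's row of P --
    # the past trajectory is never consulted. This is the Markov property.
    return random.choices([0, 1], weights=P[state])[0]

def simulate(n_steps, start=0, seed=42):
    random.seed(seed)
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1]))
    return chain

chain = simulate(10)
```

Note that `step` takes a single argument, the present state; no history is passed in, mirroring the conditional independence in the formula above.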
b) A Markov chain is said to be irreducible if every state can be reached from every other state.
Let Pij(n) denote the probability of moving from state i to state j in n steps.
Then the Markov chain is irreducible if, for every pair of states i and j, Pij(n) > 0 for some n >= 1.
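The condition "Pij(n) > 0 for some n" can be checked by graph reachability on the positive entries of the transition matrix, since a positive k-step probability exists exactly when there is a path of positive one-step probabilities. Here is a minimal sketch; the matrices `P_irr` and `P_red` are illustrative examples, not from the question.

```python
from collections import deque

def reachable(P, i):
    # Breadth-first search over edges with positive one-step probability:
    # j is in the result iff Pij(n) > 0 for some n >= 0.
    seen, queue = {i}, deque([i])
    while queue:
        u = queue.popleft()
        for v, p in enumerate(P[u]):
            if p > 0 and v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def is_irreducible(P):
    # Irreducible: every state reaches every other state.
    n = len(P)
    return all(len(reachable(P, i)) == n for i in range(n))

# Hypothetical examples:
P_irr = [[0.5, 0.5], [1.0, 0.0]]   # every state reaches every other state
P_red = [[1.0, 0.0], [0.5, 0.5]]   # state 0 is absorbing, so state 1 is unreachable from it
```

In `P_red`, state 0 loops to itself with probability 1, so P01(n) = 0 for all n and the chain is reducible.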
c) A state of a Markov chain is said to be ergodic if it is aperiodic and non-null persistent.
An aperiodic state is one with no fixed period of return: the period of a state is the greatest common divisor of all step counts n at which a return to that state is possible, and the state is aperiodic if this gcd equals 1. For example, if the process can return to state 1 at steps 4, 9, 11, etc., the gcd of these return times is 1, so the state is aperiodic.
A non-null persistent (positive recurrent) state is one whose mean recurrence time is finite.
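The gcd characterization of the period can be computed directly: take the gcd of all n (up to some horizon) for which the n-step return probability P^n[i][i] is positive. The two small chains below are hypothetical examples chosen to show period 1 (a self-loop forces aperiodicity) versus period 2 (a strict 2-cycle).

```python
from math import gcd

def mat_mul(A, B):
    # Plain matrix multiplication for small transition matrices.
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, i, max_n=50):
    # gcd of all n <= max_n with P^n[i][i] > 0; the state is aperiodic
    # exactly when this gcd is 1.
    g = 0
    M = P
    for n in range(1, max_n + 1):
        if M[i][i] > 0:
            g = gcd(g, n)
        M = mat_mul(M, P)
    return g

P_aper = [[0.5, 0.5], [1.0, 0.0]]  # self-loop at state 0 -> period 1 (aperiodic)
P_per2 = [[0.0, 1.0], [1.0, 0.0]]  # strict 2-cycle -> period 2 (periodic)
```

For `P_per2`, returns to state 0 are only possible at steps 2, 4, 6, ..., so the gcd is 2 and the state is periodic; adding any self-loop probability makes a return at step 1 possible and drops the period to 1.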