Markov chain
Memoryless
A random process is a Markov chain if the probability distribution of its future state depends only on its current state, not on any state visited in the past. Formally,

$$P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i).$$
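To make the property concrete, here is a minimal Python sketch (the two-state "weather" chain and its probabilities are illustrative assumptions, not from the text) in which the next state is sampled from the current state alone:

```python
import random

# Hypothetical two-state chain: 0 = sunny, 1 = rainy (probabilities assumed).
# transition[i][j] = probability of moving from state i to state j.
transition = [
    [0.8, 0.2],   # from sunny
    [0.4, 0.6],   # from rainy
]

def next_state(current):
    """Sample the next state from the current state alone (memorylessness)."""
    return random.choices([0, 1], weights=transition[current])[0]

state, path = 0, [0]
for _ in range(10):
    state = next_state(state)   # earlier entries of `path` are never consulted
    path.append(state)
print(path)
```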
One-step transition probability
The conditional probability

$$p_{ij}(n) = P(X_{n+1} = j \mid X_n = i)$$

is called the one-step transition probability of the Markov chain: the probability of moving from state $i$ to state $j$ at time $n$. When this probability is independent of $n$, the chain is said to be time-homogeneous, and we write it simply as $p_{ij}$.
Since the Markov chain must transition to some state, the one-step transition probabilities out of any state $i$ sum to 1:

$$\sum_{j} p_{ij} = 1.$$
By arranging all the one-step transition probabilities into a square matrix, we get the transition probability matrix:

$$P = \begin{pmatrix} p_{11} & p_{12} & \cdots \\ p_{21} & p_{22} & \cdots \\ \vdots & \vdots & \ddots \end{pmatrix}$$
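A quick numpy sanity check of this row normalization (the 3-state matrix below is an illustrative assumption):

```python
import numpy as np

# Illustrative 3-state transition matrix; P[i, j] = P(next = j | current = i).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

# Each row must sum to 1: from any state the chain moves to *some* state.
assert np.allclose(P.sum(axis=1), 1.0)
```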
Multi-step transition probability
The conditional probability

$$p_{ij}^{(n)} = P(X_{m+n} = j \mid X_m = i)$$

is called the n-step transition probability of the Markov chain. If $n = 1$, it reduces to the one-step transition probability $p_{ij}$.
The multi-step transition probabilities can be estimated empirically by simulation, or computed exactly with the Chapman-Kolmogorov equation:

$$p_{ij}^{(m+n)} = \sum_{k} p_{ik}^{(m)} \, p_{kj}^{(n)}.$$
Written in matrix form:

$$P^{(m+n)} = P^{(m)} P^{(n)}.$$
In other words, the n-step transition probability matrix is the one-step transition probability matrix raised to the n-th power:

$$P^{(n)} = P^{n}.$$
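A short numpy sketch, reusing the illustrative 3-state matrix from above, that verifies the Chapman-Kolmogorov identity and cross-checks one multi-step probability by simulation:

```python
import random
import numpy as np

# Illustrative 3-state transition matrix (same assumption as above).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

# Chapman-Kolmogorov in matrix form: P^(2+3) = P^2 P^3, and P^(n) = P^n.
P2 = np.linalg.matrix_power(P, 2)
P3 = np.linalg.matrix_power(P, 3)
assert np.allclose(np.linalg.matrix_power(P, 5), P2 @ P3)

# Empirical estimate of one entry: p_{0,2}^{(3)} by direct simulation.
trials, hits = 100_000, 0
for _ in range(trials):
    state = 0
    for _ in range(3):                           # take three steps
        state = random.choices(range(3), weights=P[state])[0]
    hits += (state == 2)
print("simulated:", hits / trials, "exact:", P3[0, 2])
```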
Ergodicity
What happens to the transition probability as the number of steps goes to infinity?
For the general two-state Markov chain with transition matrix

$$P = \begin{pmatrix} 1-a & a \\ b & 1-b \end{pmatrix}, \qquad 0 < a, b < 1,$$
the n-step transition probability matrix can be calculated in closed form:

$$P^{n} = \frac{1}{a+b}\begin{pmatrix} b & a \\ b & a \end{pmatrix} + \frac{(1-a-b)^{n}}{a+b}\begin{pmatrix} a & -a \\ -b & b \end{pmatrix}.$$
Since $|1-a-b| < 1$, the n-step transition probabilities have a limit:

$$\lim_{n \to \infty} P^{n} = \frac{1}{a+b}\begin{pmatrix} b & a \\ b & a \end{pmatrix}.$$
That is to say, for a particular state $j$, the long-run probability of being in $j$ no longer depends on the starting state $i$: both rows of the limit matrix are identical.
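A numerical check of this limit, with illustrative parameters $a = 0.3$ and $b = 0.1$:

```python
import numpy as np

a, b = 0.3, 0.1   # illustrative parameters with 0 < a, b < 1
P = np.array([
    [1 - a, a],
    [b, 1 - b],
])

# After many steps every row of P^n approaches (b/(a+b), a/(a+b)).
print(np.linalg.matrix_power(P, 100))
print("limit row:", np.array([b, a]) / (a + b))
```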
Limiting distribution
Since the limit

$$\pi_j = \lim_{n \to \infty} p_{ij}^{(n)}$$

exists and is independent of the initial state $i$, the row vector $\pi = (\pi_1, \pi_2, \ldots)$ satisfies $\pi P = \pi$ and $\sum_j \pi_j = 1$, which is the limiting distribution of the chain. For the two-state chain above, $\pi = \left(\frac{b}{a+b}, \frac{a}{a+b}\right)$.
If we can find a positive integer $m$ such that every entry of $P^{m}$ is strictly positive, i.e. $p_{ij}^{(m)} > 0$ for all states $i$ and $j$, then the chain is ergodic: the limits $\pi_j$ exist, are positive, and give its unique limiting (stationary) distribution.
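Assuming this positivity condition holds, the limiting distribution can be computed as the left eigenvector of $P$ for eigenvalue 1; a minimal numpy sketch (reusing the illustrative 3-state matrix):

```python
import numpy as np

# Illustrative 3-state transition matrix (an assumption, not from the text).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

# Sufficient condition for ergodicity: some power of P is entrywise positive.
m = 1
assert (np.linalg.matrix_power(P, m) > 0).all()

# pi P = pi means pi is a left eigenvector of P with eigenvalue 1,
# i.e. an ordinary eigenvector of P transposed.
eigvals, eigvecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, k])
pi = pi / pi.sum()           # normalize so the probabilities sum to 1

print("limiting distribution:", pi)
print("pi P == pi:", np.allclose(pi @ P, pi))
```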