Thursday, December 3, 2009

Ergodicity of Markov chains

1- A Markov chain is called an ergodic chain if it is possible to go from every state to every state (not necessarily in one move).

2- A Markov chain is called a regular chain if some power of the transition matrix has only positive elements.

3- Any transition matrix that has no zeros determines a regular Markov chain.

4- However, it is possible for a regular Markov chain to have a transition matrix that has zeros.
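Point 4 can be checked directly: the small two-state matrix below (my own illustrative example, not from the source) has a zero entry, yet its square is strictly positive, so the chain it defines is regular.

```python
# Assumed example: a transition matrix with one zero entry whose
# square is strictly positive, hence the chain is regular.
P = [[0.5, 0.5],
     [1.0, 0.0]]  # P[1][1] == 0, but the chain is still regular

def mat_mul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = mat_mul(P, P)
print(P2)  # [[0.75, 0.25], [0.5, 0.5]] -- every entry positive
```

Since every entry of P² is positive, "some power of the transition matrix has only positive elements" holds with the power 2.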

For an ergodic Markov chain with transition matrix P, there is a unique probability vector w such that wP = w, and w is strictly positive.
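For a regular chain, the fixed vector w can be approximated by power iteration: start from any probability vector and repeatedly multiply by P. A minimal sketch, reusing the same assumed two-state matrix as above (for it, w = (2/3, 1/3)):

```python
# Assumed 2-state example: approximate the fixed probability vector w
# with wP = w by repeatedly applying the transition matrix P.
P = [[0.5, 0.5],
     [1.0, 0.0]]

def step(w, P):
    """One step of the chain: row vector w times matrix P."""
    n = len(P)
    return [sum(w[i] * P[i][j] for i in range(n)) for j in range(n)]

w = [1.0, 0.0]           # any starting distribution works for a regular chain
for _ in range(100):
    w = step(w, P)

print(w)  # converges to [2/3, 1/3], the unique strictly positive fixed vector
```

Checking by hand: w = (2/3, 1/3) gives wP = (2/3·0.5 + 1/3·1.0, 2/3·0.5 + 1/3·0) = (2/3, 1/3), as the theorem requires.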

source:
http://www.math.dartmouth.edu/archive/m20x06/public_html/Lecture15.pdf
