bn:02738681n
Noun Concept
Categories: Markov processes, Markov models
EN
absorbing Markov chain · Fundamental matrix
In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state. (Wikipedia)
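For a concrete illustration of the two terms above, here is a minimal sketch (the specific transition probabilities are assumptions chosen for the example) that builds a small absorbing chain and computes its fundamental matrix N = (I − Q)⁻¹, where Q is the sub-matrix of transitions among transient states:

```python
import numpy as np

# Transition matrix in canonical form: transient states listed first,
# then absorbing states. This toy chain (probabilities chosen for
# illustration) has two transient states and one absorbing state.
P = np.array([
    [0.5, 0.3, 0.2],   # transient state 0
    [0.2, 0.5, 0.3],   # transient state 1
    [0.0, 0.0, 1.0],   # absorbing state: P[i, i] == 1
])

t = 2                              # number of transient states
Q = P[:t, :t]                      # transitions among transient states
N = np.linalg.inv(np.eye(t) - Q)   # fundamental matrix N = (I - Q)^-1

# N[i, j] is the expected number of visits to transient state j when
# starting from transient state i; row sums give the expected number
# of steps before absorption.
expected_steps = N.sum(axis=1)
print(expected_steps)
```

Because every transient state here has a path to the absorbing state, (I − Q) is invertible and the row sums of N are finite, matching the definition that every state can reach an absorbing state.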