bn:02738681n
Noun Concept
PT
cadeia de Markov absorvente
EN
In the mathematical theory of probability, an absorbing Markov chain is a Markov chain in which every state can reach an absorbing state, i.e. a state that, once entered, cannot be left. (Wikipedia)
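The definition above can be checked mechanically: a state i is absorbing when it transitions only to itself (P[i][i] = 1), and the chain is absorbing when every state can reach some absorbing state. A minimal sketch, assuming the chain is given as a row-stochastic transition matrix in plain Python lists (the function name and example matrix are illustrative, not from the entry):

```python
def is_absorbing_chain(P):
    """Return True if the transition matrix P describes an absorbing Markov chain."""
    n = len(P)
    # A state i is absorbing if it transitions only to itself: P[i][i] == 1.
    absorbing = {i for i in range(n) if P[i][i] == 1.0}
    if not absorbing:
        return False
    # From every start state, search the states reachable with positive probability
    # and require that at least one of them is absorbing.
    for start in range(n):
        seen, frontier = {start}, [start]
        while frontier:
            i = frontier.pop()
            for j in range(n):
                if P[i][j] > 0 and j not in seen:
                    seen.add(j)
                    frontier.append(j)
        if not (seen & absorbing):
            return False
    return True

# Hypothetical example: state 2 is absorbing, and states 0 and 1 can reach it.
P = [[0.5, 0.25, 0.25],
     [0.25, 0.5, 0.25],
     [0.0, 0.0, 1.0]]
```

Here `is_absorbing_chain(P)` returns True, while a two-state chain that only alternates, `[[0, 1], [1, 0]]`, has no absorbing state and returns False.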