bn:01219316n
Noun Named Entity
Categories: Markov processes
EN
Harris chain
In the mathematical study of stochastic processes, a Harris chain is a Markov chain that returns to a particular part of the state space an unbounded number of times. (Wikipedia)
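The defining property can be illustrated with a minimal Python sketch (not part of the entry itself): a small Markov chain whose transition probabilities are all strictly positive is Harris recurrent, so it revisits any fixed part of the state space (here, state 0) again and again, and the count of returns keeps growing with the run length. The transition matrix below is an arbitrary illustrative choice.

```python
import random

# Illustrative three-state chain with strictly positive transition
# probabilities; such a chain returns to any fixed state infinitely
# often with probability 1 (the "unbounded number of returns" above).
P = {
    0: [(0, 0.2), (1, 0.5), (2, 0.3)],
    1: [(0, 0.4), (1, 0.3), (2, 0.3)],
    2: [(0, 0.5), (1, 0.2), (2, 0.3)],
}

def step(state, rng):
    """Sample the next state from the transition row of `state`."""
    r = rng.random()
    acc = 0.0
    for nxt, p in P[state]:
        acc += p
        if r < acc:
            return nxt
    return P[state][-1][0]  # guard against floating-point round-off

def count_returns(n_steps, seed=0):
    """Run the chain from state 0 and count visits back to state 0."""
    rng = random.Random(seed)
    state, returns = 0, 0
    for _ in range(n_steps):
        state = step(state, rng)
        if state == 0:
            returns += 1
    return returns

# Returns to state 0 accumulate without bound as the run length grows.
print(count_returns(1_000), count_returns(10_000))
```

Lengthening the run only increases the number of recorded returns, which is the informal content of Harris recurrence for this toy chain.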
Definitions
Sources
Type of stochastic Markov process (Wikidata)
Wikipedia
Wikidata