Markov-chain Definition
noun
(probability theory) A discrete-time stochastic process with the Markov property.
Wiktionary
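The Markov property referenced in this definition is usually stated as a conditional-independence condition (a standard formulation in LaTeX notation, not part of the original entry): for a discrete-time process X_0, X_1, X_2, ...,
\[
\Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0) = \Pr(X_{n+1} = x \mid X_n = x_n),
\]
i.e., the distribution of the next state depends only on the current state, not on the earlier history.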
Synonyms:
- Markoff chain
Other Word Forms of Markov-chain
Noun
Singular:
markov-chain
Plural:
Markov chains