Markov Process Definition
märkôf
noun
A chain of random events in which the probability of each future state depends only on the present state, not on the states that preceded it, as in a genetic code.
Webster's New World
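The defining "memoryless" property can be illustrated with a short simulation. This is a minimal sketch, not part of the dictionary entry: the two-state weather model and its transition probabilities below are invented for illustration. Note that `next_state` consults only the current state, never the history of the chain.

```python
import random

# Hypothetical two-state weather model (illustrative values only):
# each row gives the probabilities of the next state given the current one.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    """Choose the next state using only the current state's row."""
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return rng.choices(states, weights=weights)[0]

def simulate(start, steps, seed=None):
    """Run the chain for `steps` transitions starting from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain

print(simulate("sunny", 5, seed=1))
```

Because each step depends only on the state just visited, two runs that happen to reach the same state have identical distributions over their futures, regardless of how they got there.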
Synonyms:
- Markoff process