Markov process
A stochastic process whose future behaviour depends only on its current state, not on the history of states that preceded it (the Markov property). When the state space is discrete, the sequence of states can be described by a Markov chain.
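As an illustrative sketch (the state names and transition probabilities below are invented for the example), a discrete Markov process can be simulated by sampling each next state from probabilities that depend only on the current state:

```python
import random

# Hypothetical two-state weather model; probabilities are made up.
# transitions[state] lists (next_state, probability) pairs.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Pick the next state using only the current state (Markov property)."""
    states, probs = zip(*transitions[state])
    return random.choices(states, weights=probs)[0]

def simulate(start, n):
    """Generate a chain of n states starting from `start`."""
    chain = [start]
    for _ in range(n - 1):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 5))
```

Note that `simulate` never inspects anything but the most recent state, which is exactly what makes the resulting sequence a Markov chain.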
Last updated: 1995-02-23
Nearby terms:
Markov model ♦ Markov process ♦ Markowitz ♦ mark-sweep garbage collection