Markov process

<probability, simulation>

A stochastic process in which the probability of each event depends only on the state reached in the previous event (the Markov property), so that the sequence of events can be described by a Markov chain.
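
A minimal illustrative sketch in Python (not part of the original entry): it simulates a two-state Markov process whose states and transition probabilities are invented for the example, with each next state drawn using only the current state.

    import random

    # Hypothetical transition probabilities: P(next state | current state).
    transitions = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        """Draw the next state from the current state alone (Markov property)."""
        next_states = list(transitions[state].keys())
        weights = list(transitions[state].values())
        return random.choices(next_states, weights=weights)[0]

    # Simulate a short realisation of the process.
    state = "sunny"
    chain = [state]
    for _ in range(10):
        state = step(state)
        chain.append(state)
    print(chain)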

Last updated: 1995-02-23

Nearby terms:

Markov model · Markov process · Markowitz · mark-sweep garbage collection
