Markov process

<probability, simulation> A stochastic process in which the probability of each successive state depends only on the current state, not on how that state was reached, so the resulting sequence of states can be described by a Markov chain.
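
As a minimal sketch (not part of the original entry), a two-state Markov process can be simulated in Python; the states and transition probabilities below are illustrative assumptions, not taken from the dictionary:

import random

# Illustrative transition probabilities: P[state][next_state]
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Choose the next state using only the current state (the Markov property)."""
    r = random.random()
    cumulative = 0.0
    for next_state, prob in P[state].items():
        cumulative += prob
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point round-off

def simulate(start, n):
    """Generate a sequence of n states -- one realisation of the Markov chain."""
    states = [start]
    for _ in range(n - 1):
        states.append(step(states[-1]))
    return states

print(simulate("sunny", 10))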

Last updated: 1995-02-23

Copyright Denis Howe 1985
