Markov process
<probability, simulation> A stochastic process in which the probability of
each state depends only on the current state, not on the earlier history;
the resulting sequence of states can be described by a Markov chain.
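
A minimal sketch of simulating such a process, using a hypothetical
two-state chain with illustrative transition probabilities:

    import random

    # Hypothetical two-state chain; the probabilities are made up for
    # illustration only.
    transitions = {
        "sunny": [("sunny", 0.8), ("rainy", 0.2)],
        "rainy": [("sunny", 0.4), ("rainy", 0.6)],
    }

    def step(state):
        # The next state depends only on the current state (the Markov
        # property), chosen according to the transition probabilities.
        states, weights = zip(*transitions[state])
        return random.choices(states, weights=weights)[0]

    state = "sunny"
    for _ in range(10):
        state = step(state)
        print(state)
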
(1995-02-23)
Nearby terms:
Markov « Markov chain « Markov model « Markov process » Markowitz » mark-sweep garbage collection » markup