Meaning of Markov process

Definition of Markov process

(noun) a stochastic process in which the distribution of future states depends only on the present state, not on how the process arrived at that state
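This defining condition is known as the Markov property. For a discrete-time process X_0, X_1, X_2, ... it can be stated formally as follows:

```latex
% The Markov property: the conditional distribution of the next state
% depends only on the current state, not on the earlier history.
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0)
  = P(X_{n+1} = x \mid X_n = x_n)
```

As an illustration, here is a minimal Python sketch of a two-state Markov chain; the state names and transition probabilities are made up for the example, not taken from any real model:

```python
import random

# Hypothetical two-state chain with illustrative transition
# probabilities (each row sums to 1).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state using only the current state;
    earlier states play no role (the Markov property)."""
    probs = transitions[state]
    return random.choices(list(probs), weights=probs.values())[0]

state = "sunny"
for _ in range(10):
    state = step(state)
    print(state)
```

Note that `step` receives only the current state, so the simulated trajectory cannot depend on its own history, which is exactly the property the definition above describes.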

Other information on Markov process
