Meaning of markoff chain

(noun) a Markov process for which the time parameter takes only discrete values
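
For a concrete picture of the definition, here is a minimal sketch of such a chain in Python; the two states and their transition probabilities are invented for illustration and are not part of the entry itself.

```python
import random

# Hypothetical two-state weather chain; the states and probabilities
# here are invented for illustration.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state: str) -> str:
    """Draw the next state. The distribution depends only on the
    current state (the Markov property), and time advances one
    discrete step per call, which is what makes this a chain."""
    next_states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in next_states]
    return random.choices(next_states, weights=weights)[0]

# Simulate ten discrete time steps starting from "sunny".
state = "sunny"
history = [state]
for _ in range(10):
    state = step(state)
    history.append(state)
print(history)
```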
