Word Study
markoff chain
WORDNET DICTIONARY
Noun markoff chain has 1 sense
- markoff chain (n = noun.process) markov chain - a Markov process for which the parameter is discrete time values; is a kind of markoff process, markov process
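
To make the definition concrete, the following is a minimal Python sketch of a discrete-time Markov chain. The weather states and transition probabilities here are hypothetical, chosen only for illustration; the point is that the process evolves over discrete time steps t = 0, 1, 2, ... and the next state depends only on the current one.

import random

# Hypothetical transition probabilities between two states
# (invented for this example, not from the dictionary entry).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state: str) -> str:
    """Sample the next state; it depends only on the current state."""
    states = list(TRANSITIONS[state])
    weights = list(TRANSITIONS[state].values())
    return random.choices(states, weights=weights)[0]

def simulate(start: str, steps: int) -> list[str]:
    """Generate a sample path of the chain over `steps` discrete time steps."""
    path = [start]
    for _ in range(steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))

Running the sketch prints one random sample path of the chain, e.g. a list of ten successive states starting from "sunny".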
For further exploration of "markoff chain", see the Webster Dictionary Online.