markoff chain
Meaning, Definition & Usage
noun
a Markov process for which the parameter is discrete time values.
Related: Markov chain
— WordNet
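The definition above can be illustrated with a small simulation: a discrete-time Markov chain moves between states in fixed steps, and the next state depends only on the current one. The sketch below is a minimal, hypothetical example (the `simulate_markov_chain` function and the two-state weather model are inventions for illustration, not part of WordNet).

```python
import random

def simulate_markov_chain(transition, start, steps, rng=None):
    """Walk a discrete-time Markov chain for a given number of steps.

    transition: dict mapping each state to a list of (next_state, probability)
    pairs; the probabilities for each state should sum to 1.
    """
    rng = rng or random.Random()
    state = start
    path = [state]
    for _ in range(steps):
        r = rng.random()
        cumulative = 0.0
        for next_state, p in transition[state]:
            cumulative += p
            if r < cumulative:
                state = next_state
                break
        path.append(state)
    return path

# Hypothetical two-state weather chain: the next state depends only on
# the current state (the Markov property), and time advances in discrete steps.
weather = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}
path = simulate_markov_chain(weather, "sunny", 10, random.Random(0))
```

Here the "parameter" in the definition is the step index 0, 1, 2, …, which is what makes the chain discrete-time.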