Markov chain
Meaning, Definition & Usage
noun
a Markov process in which the parameter takes discrete time values
Also: Markoff chain
WordNet
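The definition above can be illustrated with a minimal sketch: a discrete-time Markov chain advances in whole time steps, and the next state depends only on the current one. The two-state "weather" chain below, with its state names and transition probabilities, is purely illustrative and not part of the entry.

```python
import random

# Transition probabilities for a hypothetical two-state chain
# (states and probabilities are illustrative assumptions).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state from the current state's transition row."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transitions[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n):
    """Walk the chain for n discrete time steps."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))
```

Because the parameter (time) is discrete, each call to `step` corresponds to exactly one tick of the chain; a Markov process with a continuous time parameter would instead require sampling holding times.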