Markov chain
or Mar·koff chain
[ mahr-kawf ]
/ ˈmɑr kɔf /
noun Statistics.
a Markov process restricted to discrete random events or to discontinuous time sequences.
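As a rough formalization (not part of the dictionary entry itself): for a discrete-time chain with successive states X_0, X_1, X_2, …, the defining property is that the next state depends only on the current one,

\[
P(X_{n+1}=x \mid X_n=x_n,\, X_{n-1}=x_{n-1},\, \dots,\, X_0=x_0) \;=\; P(X_{n+1}=x \mid X_n=x_n).
\]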
Origin of Markov chain
First recorded in 1940–45; see origin at Markov process
British Dictionary definitions for Markov chain
Markov chain
/ ˈmɑːkɒf /
noun
statistics
a sequence of events, the probability of each of which depends only on the event immediately preceding it
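A minimal sketch of this idea in Python, assuming a hypothetical two-state chain ("sunny"/"rainy") with made-up transition probabilities; the state names and numbers are illustrative only and do not come from the entry:

```python
import random

# Hypothetical transition probabilities: each row gives the probability of the
# next state given only the current state (the Markov property in action).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state):
    """Draw the next state using only the current state."""
    probs = TRANSITIONS[state]
    return random.choices(list(probs), weights=probs.values())[0]

def simulate(start, n):
    """Generate a chain of n states starting from `start`."""
    states = [start]
    for _ in range(n - 1):
        states.append(step(states[-1]))
    return states

print(simulate("sunny", 10))
```

Note that `step` never looks at anything except the current state, which is exactly the dependence described in the definition above.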
Word Origin for Markov chain
C20: named after Andrei Markov (1856–1922), Russian mathematician