Markov process

or Markoff process


noun Statistics.

a process in which the probability of future values of a random variable depends only on its present value, not on the sequence of events that preceded it.
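
For illustration only (not part of the dictionary entry), the following minimal Python sketch simulates a hypothetical two-state weather chain; the states and transition probabilities are invented. It shows the defining property: the next value is drawn using only the current state, never earlier history.

    import random

    # Hypothetical transition probabilities: transition[current][next]
    transition = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def next_state(current):
        # Sample the next state from the row for the current state;
        # earlier states are never consulted (the Markov property).
        states = list(transition[current].keys())
        weights = list(transition[current].values())
        return random.choices(states, weights=weights)[0]

    state = "sunny"
    for _ in range(10):
        state = next_state(state)
        print(state)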

Origin of Markov process

1935–40; named after Russian mathematician Andreĭ Andreevich Markov (1856–1922), who developed the theory of such processes