Markov chain
Syllabification: Mar·kov chain
Pronunciation: /ˈmärˌkôf/, /-ˌkôv/
(also Markov model)

Definition of Markov chain in English:


A stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
Example sentences
  • Like the previously discussed models, Markov models have serious limitations.
  • This model represents a Markov chain in which each state is interpreted as the probability that the switch complex is in the corresponding state.
  • He applied a technique involving so-called Markov chains to calculate the required probabilities over the course of a long game with many battles.


Mid 20th century: named after Andrei A. Markov (1856–1922), Russian mathematician.
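
Note: the definition above is the Markov property: the distribution of the next state depends only on the current state, not on the rest of the history. The following Python sketch illustrates it with a hypothetical two-state weather chain; the state names and transition probabilities are invented purely for illustration.

    import random

    # Hypothetical two-state chain: each row gives the distribution of the
    # next state conditioned only on the current state (the Markov property).
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        # Draw the next state; the weights depend only on `state`.
        probs = TRANSITIONS[state]
        return random.choices(list(probs), weights=list(probs.values()))[0]

    def simulate(start, n):
        # Generate a sequence of n states starting from `start`.
        chain = [start]
        for _ in range(n - 1):
            chain.append(step(chain[-1]))
        return chain

    print(simulate("sunny", 10))  # e.g. ['sunny', 'sunny', 'rainy', ...]

Because each draw consults only the current state, everything earlier in the sequence is irrelevant to the next step, which is exactly what the definition describes.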
