Definition of Markov chain in English:

Markov chain

Syllabification: Mar·kov chain
Pronunciation: /ˈmärˌkôf, -ˌkôv/
(also Markov model)

noun

Statistics
A stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
Example sentences
  • Like the previously discussed models, Markov models have serious limitations.
  • This model represents a Markov chain in which each state is interpreted as the probability that the switch complex is in the corresponding state.
  • He applied a technique involving so-called Markov chains to calculate the required probabilities over the course of a long game with many battles.
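For illustration, a minimal Python sketch of the property in the definition above, assuming a hypothetical two-state "weather" chain (the states and transition probabilities are invented for the example): each step is sampled using only the current state, never the earlier history.

    import random

    # Hypothetical two-state chain: each row gives P(next state | current state).
    transitions = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        """Sample the next state using only the current state's transition row."""
        r = random.random()
        cumulative = 0.0
        for next_state, p in transitions[state].items():
            cumulative += p
            if r < cumulative:
                return next_state
        return next_state  # guard against floating-point rounding

    def simulate(start, n_steps):
        """Generate one realisation of the chain, length n_steps + 1."""
        chain = [start]
        for _ in range(n_steps):
            chain.append(step(chain[-1]))
        return chain

    if __name__ == "__main__":
        print(simulate("sunny", 10))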

Origin

mid 20th century: named after Andrei A. Markov (1856–1922), Russian mathematician.
