Markov chain

Syllabification: Mar·kov chain
Pronunciation: /ˈmärˌkôf, -ˌkôv/
(also Markov model)

Definition of Markov chain in English:

noun

Statistics
A stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
Example sentences
  • Like the previously discussed models, Markov models have serious limitations.
  • This model represents a Markov chain in which each state is interpreted as the probability that the switch complex is in the corresponding state.
  • He applied a technique involving so-called Markov chains to calculate the required probabilities over the course of a long game with many battles.
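The defining property in the entry above, that the probability of the next state depends only on the current state, can be illustrated with a short simulation. The two-state weather chain and its transition probabilities below are hypothetical, chosen purely for illustration; they are not taken from the entry.

```python
import random

# Hypothetical transition probabilities for a two-state weather chain
# (illustrative values, not from the dictionary entry).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Pick the next state; the probabilities depend only on the current state."""
    next_states = list(transitions[state])
    weights = [transitions[state][s] for s in next_states]
    return random.choices(next_states, weights=weights)[0]

def simulate(start, n):
    """Generate a sequence of n states starting from `start`."""
    chain = [start]
    for _ in range(n - 1):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 10))
```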

Origin

mid 20th century: named after Andrei A. Markov (1856–1922), Russian mathematician.
