Markov model

Line breaks: Mar¦kov model
Pronunciation: /ˈmɑːkɒf/
(also Markov chain)

Definition of Markov model in English:

noun

Statistics
A stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
Example sentences
  • Like the previously discussed models, Markov models have serious limitations.
  • This model represents a Markov chain in which each state is interpreted as the probability that the switch complex is in the corresponding state.
  • He applied a technique involving so-called Markov chains to calculate the required probabilities over the course of a long game with many battles.
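The defining property, that each step depends only on the current state, can be illustrated with a small simulation. Below is a minimal sketch in Python; the two-state "weather" chain, the TRANSITIONS table, and the simulate helper are illustrative assumptions, not part of the entry.

    import random

    # Minimal two-state chain: for each current state, the probabilities
    # of the possible next states.
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def step(state):
        """Draw the next state using only the current state (the Markov property)."""
        states = list(TRANSITIONS[state])
        weights = [TRANSITIONS[state][s] for s in states]
        return random.choices(states, weights=weights)[0]

    def simulate(start, n):
        """Generate a sequence of n states starting from `start`."""
        sequence = [start]
        for _ in range(n - 1):
            sequence.append(step(sequence[-1]))
        return sequence

    if __name__ == "__main__":
        print(simulate("sunny", 10))

Each call to step looks only at the most recent state, never at the earlier history, which is exactly the dependence described in the definition above.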

Origin

mid 20th century: named after Andrei A. Markov (1856–1922), Russian mathematician.
