Definition of Markov chain in English:

Markov chain

Syllabification: Mar·kov chain
Pronunciation: /ˈmärˌkôf, -ˌkôv/
(also Markov model)

noun

Statistics
  • A stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
    More example sentences
    • Like the previously discussed models, Markov models have serious limitations.
    • This model represents a Markov chain in which each state is interpreted as the probability that the switch complex is in the corresponding state.
    • He applied a technique involving so-called Markov chains to calculate the required probabilities over the course of a long game with many battles.
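
For illustration: a Markov chain can be simulated by repeatedly sampling the next state from a distribution that depends only on the current state. The Python sketch below assumes a hypothetical two-state chain ("sunny"/"rainy") with made-up transition probabilities, purely to show that mechanic.

import random

# Hypothetical transition probabilities, assumed for illustration only.
# Each inner dict gives P(next state | current state).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps):
    """Generate a state sequence; each step depends only on the current state."""
    state = start
    path = [state]
    for _ in range(steps):
        next_states = list(TRANSITIONS[state])
        weights = [TRANSITIONS[state][s] for s in next_states]
        state = random.choices(next_states, weights=weights, k=1)[0]
        path.append(state)
    return path

print(simulate("sunny", 10))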

Origin

mid 20th century: named after Andrei A. Markov (1856–1922), Russian mathematician.
