Markov chain
Syllabification: Mar·kov chain
Pronunciation: /ˈmärˌkôf/, /-ˌkôv/
(also Markov model)

Definition of Markov chain in English:

noun

Statistics
A stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.
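In symbols, the defining property is P(Xn+1 = x | X1, …, Xn) = P(Xn+1 = x | Xn): the history before the current state carries no extra information. A minimal simulation sketch in Python may make this concrete; the state names and transition probabilities below are invented purely for illustration.

  import random

  # Hypothetical two-state chain; these transition probabilities
  # are made up for the example.
  TRANSITIONS = {
      "sunny": {"sunny": 0.8, "rainy": 0.2},
      "rainy": {"sunny": 0.4, "rainy": 0.6},
  }

  def step(state):
      # The next state is drawn using only the current state --
      # this is the Markov property.
      options = TRANSITIONS[state]
      return random.choices(list(options), weights=list(options.values()))[0]

  state = "sunny"
  for _ in range(10):
      state = step(state)
      print(state)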
Example sentences
  • Like the previously discussed models, Markov models have serious limitations.
  • This model represents a Markov chain in which each state is interpreted as the probability that the switch complex is in the corresponding state.
  • He applied a technique involving so-called Markov chains to calculate the required probabilities over the course of a long game with many battles.

Origin

Mid 20th century: named after Andrei A. Markov (1856–1922), Russian mathematician.
