Markov chain

English

Noun

Markov chain (plural Markov chains)

  1. (probability theory) A discrete-time stochastic process having the Markov property.
    • 2004 July 27, F. Keith Barker et al., “Phylogeny and diversification of the largest avian radiation”, in PNAS, page 11040, column 2:
      The probability density of the Bayesian posterior was estimated by Metropolis-coupled Markov chain Monte Carlo, with multiple incrementally heated chains.
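
One standard way to state the Markov property named in the definition (a sketch, for a chain (X_n) on a countable state space) is:

  \Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = \Pr(X_{n+1} = x \mid X_n = x_n)

In words: conditioned on the present state, the next state is independent of the earlier history.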

Hypernyms

  • stochastic process

Hyponyms

Translations

See also
