
The Collaborative International Dictionary
Markov chain

Markov chain \Mark"ov chain\, n. [after A. A. Markov, Russian mathematician, b. 1856, d. 1922.] (Statistics) A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the immediately preceding state or the next preceding state, independent of the path by which the preceding state was reached. It differs from the more general Markov process in that the states of a Markov chain are discrete rather than continuous. Certain physical processes, such as diffusion of a molecule in a fluid, are modelled as a Markov chain. See also random walk. [Also spelled Markoff chain.]
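
In symbols, this property can be written as follows (a standard formulation added here for illustration, not part of the dictionary entry), where X_0, X_1, X_2, ... denote the successive discrete states of the chain:

    P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0)
        = P(X_{n+1} = x \mid X_n = x_n)

That is, once the current state X_n is known, the earlier history contributes nothing further to the distribution of the next state.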

Wikipedia
Markov chain

A Markov chain (discrete-time Markov chain or DTMC), named after Andrey Markov, is a random process that undergoes transitions from one state to another on a state space. It must possess a property that is usually characterized as "memorylessness": the probability distribution of the next state depends only on the current state and not on the sequence of events that preceded it. This specific kind of "memorylessness" is called the Markov property. Markov chains have many applications as statistical models of real-world processes.
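
As a rough illustration of the definition above, the following sketch simulates a small discrete-time Markov chain in Python; the three states and the transition matrix are invented for the example, and any real model would substitute its own.

    import random

    # Illustrative sketch only: a tiny three-state chain with made-up probabilities.
    states = ["sunny", "cloudy", "rainy"]

    # transition[i][j] = probability of moving from state i to state j;
    # each row sums to 1, as a Markov chain's transition matrix requires.
    transition = [
        [0.6, 0.3, 0.1],  # from "sunny"
        [0.3, 0.4, 0.3],  # from "cloudy"
        [0.2, 0.4, 0.4],  # from "rainy"
    ]

    def simulate(start, steps):
        """Walk the chain: each next state depends only on the current state."""
        current = states.index(start)
        path = [states[current]]
        for _ in range(steps):
            current = random.choices(range(len(states)), weights=transition[current])[0]
            path.append(states[current])
        return path

    print(simulate("sunny", 10))

Because each draw uses only the row for the current state, the sequence of events that led there plays no role, which is exactly the memorylessness described above.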

Usage examples of "Markov chain".

If Platonia is made into a Markov chain, each sequence of configurations gets its own probability.

The result is to convert Platonia into what statisticians call a "Markov chain", which is just like the list of transition probabilities for snakes and ladders, but more general.

The course of political events, opaque to reconstructions and retrospects from its present state, can be studied only in a model called a Markov chain.

But the King, nothing daunted, put on his Markov chain mail and all his impervious parameters, took his increment to infinity and dealt the beast a truly Boolean blow, sent it reeling through an x-axis and several brackets--but the beast, prepared for this, lowered its horns and wham!