Wiktionary
a. (mathematics, stochastic processes, of a Markov chain) a row vector π whose entries sum to 1 and that satisfies the equation π = πP, where P is the transition matrix of the Markov chain.
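The defining equation π = πP can be illustrated numerically. The sketch below, using a small made-up 2-state transition matrix (an assumption for illustration, not from the source), finds π as a left eigenvector of P with eigenvalue 1 and checks that its entries sum to 1:

```python
import numpy as np

# Hypothetical 2-state transition matrix P; each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# A stationary distribution pi satisfies pi = pi @ P, i.e. pi is a
# left eigenvector of P (a right eigenvector of P.T) with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)

# Pick the eigenvector whose eigenvalue is numerically closest to 1.
idx = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()  # normalise so the entries sum to 1

print(pi)       # the stationary distribution
print(pi @ P)   # equals pi, confirming pi = pi P
```

For this particular P the result is π = (5/6, 1/6): applying the transition matrix to it returns the same vector.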
Wikipedia
Stationary distribution may refer to:
- The limiting distribution in a Markov chain
- The marginal distribution of a stationary process or stationary time series
- The set of joint probability distributions of a stationary process or stationary time series
In some fields of application, the term stable distribution is used for the equivalent of a stationary (marginal) distribution, although in probability and statistics the term has a rather different meaning: see stable distribution.
Crudely stated, all of the above are specific cases of a common general concept. A stationary distribution is an entity that is left unchanged by the action of some matrix or operator; it need not be unique. Stationary distributions are thus related to eigenvectors for which the eigenvalue is unity.
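The fixed-point view above can be sketched directly: starting from an arbitrary distribution and applying the operator repeatedly converges (for a well-behaved chain) to a distribution the operator no longer changes. The 3-state matrix below is an assumption chosen for illustration; its strictly positive entries guarantee convergence:

```python
import numpy as np

# Hypothetical transition matrix with all entries positive,
# so repeated application converges to a unique fixed point.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

# Start from any probability distribution and apply P repeatedly.
pi = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    pi = pi @ P

# pi is now (numerically) unchanged by P: a left eigenvector
# of P with eigenvalue 1, i.e. a stationary distribution.
print(pi)
print(np.allclose(pi @ P, pi))  # True
```

This power-iteration view makes the "unchanged by the operator" phrasing concrete: the stationary distribution is exactly the fixed point of the map π ↦ πP.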