
Spectrum of a Markov matrix

Let $latex M$ be an $latex n\times n$ Markov matrix (i.e. a stochastic matrix). This means that $latex M$ has entries in $latex [0,1]$ and each row sums to $latex 1$. Let $latex \lambda_1,\ldots,\lambda_n$ be the roots in $latex \mathbb{C}$ of its characteristic polynomial, ordered so that $latex |\lambda_1|\geq\cdots\geq|\lambda_n|$, breaking ties by increasing phase. It is not difficult to show that $latex \{\lambda_1,\ldots,\lambda_n\}\subset\{z\in\mathbb{C}:|z|\leq1\}$, that $latex \lambda_1=1$, and that the spectrum of $latex M$ is symmetric with respect to the real axis. A good reference for the spectrum of Markov matrices is the book of Seneta.
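These three facts are easy to check numerically. Here is a minimal sketch with NumPy, on a hypothetical row-stochastic matrix obtained by normalizing the rows of a random matrix:

```python
import numpy as np

# A hypothetical 4x4 Markov (row-stochastic) matrix: rows sum to 1.
rng = np.random.default_rng(0)
M = rng.random((4, 4))
M /= M.sum(axis=1, keepdims=True)

lam = np.linalg.eigvals(M)
lam = lam[np.argsort(-np.abs(lam))]  # order by decreasing modulus

assert np.all(np.abs(lam) <= 1 + 1e-10)  # spectrum lies in the unit disc
assert np.isclose(lam[0], 1)             # lambda_1 = 1
# symmetry with respect to the real axis: the spectrum equals its conjugate
assert np.allclose(np.sort_complex(lam), np.sort_complex(np.conj(lam)))
```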

Suppose from now on that $latex M$ is irreducible. Then the eigenspace associated to $latex \lambda_1$ has dimension $latex 1$ and $latex M$ admits a unique invariant probability measure $latex \mu$. One can show that the spectrum of $latex M$ is invariant under the rotation of angle $latex 2\pi/d$, where $latex d\in\{1,2,\ldots\}$ is called the period of $latex M$. In particular, there are exactly $latex d$ eigenvalues of unit modulus. Moreover, if $latex M$ is reversible, then its spectrum is real, and $latex -1$ belongs to the spectrum if and only if $latex d=2$.
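The deterministic cycle gives the simplest illustration of the periodic case: on $latex 3$ states it is irreducible with period $latex d=3$, and its spectrum is the set of cube roots of unity, invariant under rotation by $latex 2\pi/3$:

```python
import numpy as np

# Deterministic cycle on 3 states: irreducible with period d = 3.
M = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])

lam = np.linalg.eigvals(M)
# exactly d = 3 eigenvalues of unit modulus (the cube roots of unity),
# so the spectrum is invariant under rotation by 2*pi/3
assert np.allclose(np.abs(lam), 1.0)

# the unique invariant probability measure is uniform: mu M = mu
mu = np.full(3, 1/3)
assert np.allclose(mu @ M, mu)
```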

From now on, we suppose additionally that $latex M$ is aperiodic: $latex d=1$. Then $latex |\lambda_2|<1$. In this case, the spectral gap $latex 1-|\lambda_2|$ describes the worst exponential decay to the equilibrium $latex \mu$ in $latex \ell^2(\mu)$ for a Markov chain of kernel $latex M$. This corresponds to the control of the decay of the powers of $latex M-\Pi$, where $latex \Pi$ is the rank-one projection onto the equilibrium $latex \mu$, by using the spectral radius.
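This decay can be observed numerically. The sketch below uses a hypothetical symmetric "lazy" kernel on $latex 3$ states, for which $latex \mu$ is uniform, and checks that $latex M^k-\Pi$, with $latex \Pi$ the rank-one projection onto $latex \mu$, decays at rate $latex |\lambda_2|^k$:

```python
import numpy as np

# A hypothetical irreducible aperiodic kernel (lazy walk on 3 states).
M = np.array([[0.5 , 0.25, 0.25],
              [0.25, 0.5 , 0.25],
              [0.25, 0.25, 0.5 ]])

lam = np.sort(np.abs(np.linalg.eigvals(M)))[::-1]
gap = 1 - lam[1]  # spectral gap 1 - |lambda_2|

# powers of M approach the rank-one projection onto mu at rate |lambda_2|^k
mu = np.full(3, 1/3)                 # uniform is invariant here (M symmetric)
Pi = np.ones((3, 1)) @ mu[None, :]   # rank-one projection onto mu
k = 20
assert np.linalg.norm(np.linalg.matrix_power(M, k) - Pi) <= 3 * lam[1]**k
```

Here $latex \lambda_2=\lambda_3=1/4$, so the gap equals $latex 3/4$.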

Beyond the spectral gap, one may ask about some dynamical interpretation of the remaining eigenvalues. If $latex M$ is reversible, then $latex M$ is self-adjoint in $latex \ell^2(\mu)$ and $latex 1-|\lambda_k|$ for $latex \lambda_k\not\in\{0,1\}$ describes the worst exponential decay to the equilibrium $latex \mu$ in $latex \ell^2(\mu)$ when the initial law of the chain is orthogonal to the eigenspaces of $latex M^\top$ associated to the eigenvalues of higher modulus. One may also think about the Chen and Saloff-Coste criterion for the $latex \ell^2$ cutoff of reversible Markov chains. Beyond reversibility, this simple interpretation is still valid if $latex M$ is normal, i.e. if $latex MM^*=M^*M$ where $latex M^*$ is the adjoint of $latex M$ in $latex \ell^2(\mu)$.

The empirical spectral distribution of $latex M$ is the discrete probability measure

$latex \displaystyle\nu:=\frac{1}{n}\delta_{\lambda_1}+\cdots+\frac{1}{n}\delta_{\lambda_n}.$

For every $latex k\in\{0,1,2,\ldots\}$, we have the global spectral identity

$latex \displaystyle \int\!z^k\,d\nu(z)=\frac{1}{n}\sum_{i=1}^n\lambda_i^k=\frac{1}{n}\mathrm{Tr}(M^k)=\frac{1}{n}\sum_{i=1}^nM^k_{i,i}=\frac{1}{n}\sum_{i=1}^np(i,k)$

where $latex p(i,k):=M^k_{i,i}$ is the probability of a loop of length $latex k$ rooted at $latex i$ for a Markov chain of kernel $latex M$. This gives a global interpretation of the spectrum: the moments of the empirical spectral distribution of $latex M$ correspond to the mean loop probabilities for the chain.
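The identity above is straightforward to verify numerically, again on a hypothetical random row-stochastic matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
M = rng.random((n, n))
M /= M.sum(axis=1, keepdims=True)   # a hypothetical row-stochastic matrix

lam = np.linalg.eigvals(M)
for k in range(6):
    Mk = np.linalg.matrix_power(M, k)
    moment = np.mean(lam**k)          # k-th moment of nu
    mean_loop = np.mean(np.diag(Mk))  # mean loop probability (1/n) sum_i p(i,k)
    assert np.isclose(moment.real, mean_loop) and abs(moment.imag) < 1e-10
```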

Let $latex f:\{1,\ldots,n\}\to\mathbb{R}$ be some observable and $latex (X_n)_{n\geq0}$ be a Markov chain on $latex \{1,\ldots,n\}$ with kernel $latex M$. The ergodic theorem (strong law of large numbers for additive functionals) states that regardless of the initial law of the chain, we have

$latex \displaystyle \frac{f(X_0)+\cdots+f(X_{n-1})}{n}\overset{\text{a.s.}}{\underset{n\to\infty}{\longrightarrow}}\mu(f):=\int\!f\,d\mu=\sum_{k=1}^nf(k)\mu(k).$
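One may observe this convergence by simulation. The sketch below uses a hypothetical $latex 2$-state kernel whose invariant measure $latex \mu=(2/3,1/3)$ is easy to compute by hand, and checks that the ergodic average of $latex f$ is close to $latex \mu(f)$:

```python
import numpy as np

rng = np.random.default_rng(2)
M = np.array([[0.9, 0.1],
              [0.2, 0.8]])   # a hypothetical 2-state kernel
mu = np.array([2/3, 1/3])    # invariant: mu M = mu
f = np.array([1.0, 4.0])     # an observable, so mu(f) = 2/3 + 4/3 = 2

# simulate the chain and form the ergodic average of f
T = 200_000
x = 0
total = 0.0
for _ in range(T):
    total += f[x]
    x = rng.choice(2, p=M[x])   # one step of the chain from state x

assert abs(total / T - mu @ f) < 0.05   # close to mu(f) = 2
```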

Let $latex g:\{1,\ldots,n\}\to\mathbb{R}$ be such that $latex (M-I)g=f-\mu(f)$. Note that $latex L:=M-I$ is the generator of the chain of kernel $latex M$, so that $latex g$ solves the Poisson equation $latex Lg=f-\mu(f)$. Then the central limit theorem states that

$latex \displaystyle \sqrt{n}\left(\frac{f(X_0)+\cdots+f(X_{n-1})}{n}-\mu(f)\right)\overset{\text{law}}{\underset{n\to\infty}{\longrightarrow}}\mathcal{N}(0,\sigma(f)^2)$

where $latex \sigma(f)^2:=\mu(g^2)-\mu(g)^2$. Thus, if $latex M$ is reversible then

$latex \displaystyle\sigma(f)^2\leq \frac{\mu(f^2)-\mu(f)^2}{\mathrm{dist}(1,\{\lambda_2,\ldots,\lambda_n\})^2}.$

Note that when $latex M$ and $latex \mu$ are not known, one may estimate $latex \sigma(f)^2$ empirically. See e.g. the work of Trevezas and Limnios and references therein.
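For a reversible example one may check the variance bound above numerically: solve the Poisson equation via the pseudoinverse of $latex L=M-I$ (its kernel is the constants, and $latex f-\mu(f)$ lies in its range), then compare $latex \sigma(f)^2$ with the right-hand side. The sketch below reuses the symmetric lazy kernel, for which $latex \mu$ is uniform:

```python
import numpy as np

# Reversible example: symmetric kernel, so mu is uniform.
M = np.array([[0.5 , 0.25, 0.25],
              [0.25, 0.5 , 0.25],
              [0.25, 0.25, 0.5 ]])
mu = np.full(3, 1/3)
f = np.array([0.0, 1.0, 3.0])   # an arbitrary observable

# solve the Poisson equation (M - I) g = f - mu(f) via the pseudoinverse
L = M - np.eye(3)
g = np.linalg.pinv(L) @ (f - mu @ f)

var_g = mu @ g**2 - (mu @ g)**2           # sigma(f)^2
var_f = mu @ f**2 - (mu @ f)**2           # mu(f^2) - mu(f)^2
lam = np.sort(np.linalg.eigvals(M).real)[::-1]
dist = np.min(np.abs(1 - lam[1:]))        # dist(1, {lambda_2, ..., lambda_n})
assert var_g <= var_f / dist**2 + 1e-12   # the variance bound
```

Here both nontrivial eigenvalues equal $latex 1/4$, so the bound is in fact an equality.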
