
Large deviations for random matrices


I am not fond of the large deviations industry, but I very much like the Sanov theorem associated with the law of large numbers for empirical distributions. The rate function in this large deviation principle is the Kullback-Leibler relative entropy. It turns out that a very pleasant Sanov-type theorem exists for certain models of random matrices. Namely, let $V:\mathbb{R}\to\mathbb{R}$ be a continuous function such that $\exp(-V)$ is Lebesgue integrable and $V$ grows faster than the logarithm at infinity. For instance, one may take $V(x)=x^2/2$. Let $H_n$ be a random $n\times n$ Hermitian matrix with Lebesgue density proportional to

$$H\mapsto \exp\left(-n\mathrm{Tr}(V(H))\right).$$
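
For the quadratic potential $V(x)=x^2/2$, the matrix $H_n$ can be sampled directly from independent Gaussian entries. Here is a minimal numerical sketch (Python with NumPy; not part of the original argument): the scaling is chosen so that the diagonal entries are real $\mathcal{N}(0,1/n)$ and the off-diagonal entries are complex Gaussian with $\mathbb{E}|H_{ij}|^2=1/n$, which matches the density $\exp(-n\mathrm{Tr}(H^2)/2)$.

```python
import numpy as np

def sample_gue(n, rng=None):
    """Hermitian matrix with density proportional to exp(-n Tr(H^2)/2).

    Diagonal entries are real N(0, 1/n); off-diagonal entries are complex
    Gaussian with E|H_ij|^2 = 1/n, as dictated by the quadratic potential.
    """
    rng = np.random.default_rng() if rng is None else rng
    g = rng.normal(scale=np.sqrt(0.5 / n), size=(n, n)) \
        + 1j * rng.normal(scale=np.sqrt(0.5 / n), size=(n, n))
    # Hermitization keeps Gaussian entries and produces the required variances
    return (g + g.conj().T) / np.sqrt(2)
```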

Let $\lambda_{n,1},\ldots,\lambda_{n,n}$ be the eigenvalues of $H_n$. Their ordering does not matter here. These eigenvalues are real since $H_n$ is Hermitian. The unitary invariance of the law of $H_n$ allows one to show that the law of $\lambda_{n,1},\ldots,\lambda_{n,n}$, quotiented by the action of the symmetric group, has Lebesgue density proportional to

$$(\lambda_{n,1},\ldots,\lambda_{n,n})\mapsto \exp\left(-\sum_{k=1}^n nV(\lambda_{n,k})\right)\prod_{i<j}|\lambda_{n,i}-\lambda_{n,j}|^2$$

which can be rewritten as

$$(\lambda_{n,1},\ldots,\lambda_{n,n})\mapsto \exp\left(-\sum_{k=1}^n nV(\lambda_{n,k})+\sum_{i\neq j}\log|\lambda_{n,i}-\lambda_{n,j}|\right).$$
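
Writing $L_n$ for the empirical measure of the eigenvalues introduced just below, the exponent above can be recast exactly as

$$-\sum_{k=1}^n nV(\lambda_{n,k})+\sum_{i\neq j}\log|\lambda_{n,i}-\lambda_{n,j}| = -n^2\left(\int\!V\,dL_n-\int\!\int_{x\neq y}\!\log|x-y|\,dL_n(x)\,dL_n(y)\right),$$

which already suggests the speed $n^2$ and the shape of the rate function in the large deviation principle stated below.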

The Vandermonde determinant comes from the Jacobian of the diagonalization seen as a change of variables (integration over the eigenvectors), and its square can also be seen as the discriminant (the resultant of $P$ and $P'$) of the characteristic polynomial $P$. Let us consider the random counting probability measure of these eigenvalues:

$$L_n:=\frac{1}{n}\sum_{k=1}^n\delta_{\lambda_{n,k}}$$

which is a random element of the convex set $\mathcal{P}$ of probability measures on $\mathbb{R}$. A very nice result due to Ben Arous and Guionnet states that, for the topology of narrow convergence, the sequence $(L_n)_{n\geq1}$ satisfies a large deviation principle with speed $n^2$ and good rate function $\Phi$ given, up to an additive constant, by

$$\mu\in\mathcal{P}\mapsto \Phi(\mu):=\int\!V\,d\mu-\int\!\int\!\log|x-y|\,d\mu(x)d\mu(y).$$

In other words, we have for every nice set $A\subset\mathcal{P}$, as $n\to\infty$,

$$\mathbb{P}(L_n\in A)\approx \exp\left(-n^2\inf_{\mu\in A}\Phi(\mu)\right).$$
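
As a numerical sanity check (a sketch reusing the hypothetical `sample_gue` construction above, inlined to keep it self-contained), one can compare the histogram of $L_n$ with the semicircle density $\frac{1}{2\pi}\sqrt{4-x^2}$ on $[-2,2]$, and evaluate the discrete analogue of $\Phi$ (diagonal terms dropped) both at the sample and at a discretization of the semicircle law; both values should be close to the infimum of $\Phi$, which for $V(x)=x^2/2$ should equal $3/4$ under this normalization.

```python
import numpy as np

# One GUE sample, as in the earlier sketch: density proportional to exp(-n Tr(H^2)/2)
n, rng = 2000, np.random.default_rng(0)
g = rng.normal(scale=np.sqrt(0.5 / n), size=(n, n)) \
    + 1j * rng.normal(scale=np.sqrt(0.5 / n), size=(n, n))
eigs = np.linalg.eigvalsh((g + g.conj().T) / np.sqrt(2))

# Empirical measure L_n versus the semicircle density sqrt(4-x^2)/(2*pi) on [-2,2]
hist, edges = np.histogram(eigs, bins=50, range=(-2.2, 2.2), density=True)
centers = (edges[:-1] + edges[1:]) / 2
semicircle = np.sqrt(np.maximum(4 - centers**2, 0)) / (2 * np.pi)
print("max histogram deviation:", np.abs(hist - semicircle).max())

# Discrete analogue of Phi for V(x) = x^2/2, with the diagonal terms excluded
def discrete_phi(x, w):
    diff = np.abs(x[:, None] - x[None, :])
    np.fill_diagonal(diff, 1.0)  # log 1 = 0 removes the diagonal
    return np.sum(w * x**2 / 2) - np.sum(np.outer(w, w) * np.log(diff))

phi_sample = discrete_phi(eigs, np.full(n, 1.0 / n))

# The same functional at a weighted grid discretization of the semicircle law
grid = np.linspace(-2, 2, 2001)
dens = np.sqrt(np.maximum(4 - grid**2, 0))
phi_semicircle = discrete_phi(grid, dens / dens.sum())

print(phi_sample, phi_semicircle)  # both should be close to 3/4
```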

The second term in the definition of $\Phi(\mu)$ is the logarithmic energy of $\mu$ (minus the Voiculescu free entropy). When $V(x)=x^2/2$, the matrix $H_n$ belongs to the Gaussian Unitary Ensemble, and the Wigner semicircle law is the maximizer of this entropy under a second moment constraint, see e.g. the book by Saff and Totik. In particular, since the proof does not involve the underlying almost sure convergence theorem (in contrast to the Cramer theorem, which involves the law of large numbers), the large deviation principle of Ben Arous and Guionnet yields, via the first Borel-Cantelli lemma, a new proof of the Wigner theorem.
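
To make the variational statement concrete, here is the standard sketch, not spelled out in the post: a minimizer $\mu_*$ of $\Phi$ must satisfy the Euler-Lagrange (Frostman) condition

$$V(x)-2\int\log|x-y|\,d\mu_*(y)=c\ \text{ on the support of }\mu_*,\qquad \geq c\ \text{ elsewhere,}$$

for some constant $c$. For $V(x)=x^2/2$, the semicircle law $d\mu_{\mathrm{sc}}(x)=\frac{1}{2\pi}\sqrt{4-x^2}\,\mathbf{1}_{[-2,2]}(x)\,dx$ has logarithmic potential $\int\log|x-y|\,d\mu_{\mathrm{sc}}(y)=\frac{x^2}{4}-\frac{1}{2}$ for $|x|\leq2$, so the left-hand side is identically equal to $1$ on $[-2,2]$, and the inequality can be checked outside: the semicircle law is the equilibrium measure of the quadratic potential.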

In my opinion (I have a mathematical physicist soul!) the large deviation principle of Ben Arous and Guionnet is one of the nicest results in random matrix theory. It explains the appearance of the Wigner semicircle law in the Wigner theorem via a maximum entropy, or minimum energy, under moment constraints. Unfortunately, the result is only available for unitarily invariant ensembles and does not cover the case of non-Gaussian Wigner matrices with i.i.d. entries, for which the Wigner theorem is still valid. It is tempting to seek a version of this large deviation principle for such non-Gaussian Wigner matrices. The answer is not known. It is not clear that a finite positive variance assumption alone is enough to ensure that the rate function is the one of the Gaussian Unitary Ensemble. The rate function will probably depend on the law of the entries. However, the arg-infimum of this function should still be the Wigner semicircle law.

The proof of Ben Arous and Guionnet relies crucially on the explicit knowledge of the law of the spectrum, due to the unitary invariance of the model (roughly, if one puts the Vandermonde determinant into the exponential as a potential, it looks like a discrete Voiculescu entropy). Ben Arous and Zeitouni have used essentially the same method in order to establish a large deviation principle for the non-Hermitian version of the model, yielding a new proof of the circular law theorem for the Ginibre Ensemble.
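
As a quick illustration of this non-Hermitian model (a sketch with the same NumPy conventions as above): the complex Ginibre ensemble has i.i.d. complex Gaussian entries with $\mathbb{E}|G_{ij}|^2=1/n$, and its eigenvalues fill the unit disc of the complex plane as $n\to\infty$, in accordance with the circular law.

```python
import numpy as np

n, rng = 2000, np.random.default_rng(1)
# Complex Ginibre ensemble: i.i.d. complex Gaussian entries with E|G_ij|^2 = 1/n
g = rng.normal(scale=np.sqrt(0.5 / n), size=(n, n)) \
    + 1j * rng.normal(scale=np.sqrt(0.5 / n), size=(n, n))
eigs = np.linalg.eigvals(g)          # genuinely complex eigenvalues, no Hermitian symmetry
print(np.mean(np.abs(eigs) <= 1.0))  # fraction of eigenvalues in the unit disc, tends to 1
```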

It would be nice to connect these large deviation principles with transport inequalities. For connections between these two universes, take a look at the recent survey by Gozlan and Leonard.

4 Comments

  1. Djalil Chafaï 2011-06-24

    Recent progress: Chatterjee, S. and Varadhan, S.R.S., Large Deviations for Random Matrices, arXiv:1106.4366 [math.PR]. Their paper concerns another scale (they consider in a sense $\frac{1}{n}X$ instead of $\frac{1}{\sqrt{n}}X$) and another topology, and does not contain the LDP for the GUE.

  2. Yunjiang Jiang 2012-01-12

    Dear Djalil,
    Very nice post indeed! I was wondering if the LDP result mentioned above implies the rigidity of the eigenvalues, i.e., that their “total variations” from the standard semi-circle quantile positions are bounded as n goes to infinity, or perhaps grow slowly like log n? I know of some rigidity results for general Wigner matrices in Yau et al.'s work, but maybe in the GUE case one can get it to O(1)? I am mainly interested in the problem of optimal transport of eigenvalues of Haar distributed Lie groups, such as SO(n), but I imagine the behaviors should be similar.

    YJ

  3. Djalil Chafaï 2012-01-12

    Dear Yunjiang,

    Thanks for your feedback. Regarding your question, my feeling is that the weakness of the topology can be reinforced by some norm localization. But since you are at Stanford, you should really discuss the problem with Amir Dembo, who masters large deviations and random matrices!

    Best.
