# Libres pensées d'un mathématicien ordinaire

The macroscopic study, static or dynamic, of the collective behavior of individuals starting from the description of their microscopic interactions constitutes a vast subject at the interface between biology, physics, computer science, mathematics, and perhaps even the social sciences. In physics, the stability of stars and black holes in astronomy, the spin systems of statistical mechanics, as well as the kinetics of gases and plasmas in statistical physics, are striking examples. In biology, the behavior of ants, bees, schools of fish, flocks of birds, forest fires, sedentary colonies, and human crowds also belongs to this category. In computer science, peer-to-peer networks, the processes executed by a processor and their memory occupation, or more simply networks of machines, are all instances of interacting individual dynamics. On the mathematical side, different types of models have been developed, ranging from partial differential equations, with or without noise, to discrete or continuous Markov processes, via cellular automata. The most difficult models are those involving spatialized rather than exchangeable interactions. The passage from the microscopic to the macroscopic often corresponds to a passage from the discrete to the continuous. The most natural example is the central limit theorem for the simple random walk, which gives rise to Brownian motion (Donsker's invariance principle). Fluid and hydrodynamic limits are related to this example.

The work of Boltzmann on entropy in the years 1865-1905 is really amazing. Beyond important combinatorial aspects, one of the general ideas behind his work is that along certain dynamics, some functional is monotonic, and thus the long time equilibrium, if it exists, is related to the optimum of this functional under the constraints coming from the conservation laws. For the original Boltzmann equation $\partial_tf_t=A(f_t)$, which comes from the kinetic modelling of gases, the entropy is $H(f)=-\int\!f(x)\log f(x)\,dx$, and is maximized by Gaussians under a variance constraint. Here the variance constraint corresponds to the conservation law of the energy. Such functionals may be called entropies. Boltzmann was the first to use a partial differential equation to describe the evolution of a probability density function, dozens of years before the rigorous analysis of such concepts in mathematics.
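As a quick numerical illustration of this maximum entropy property (a sketch of mine, not part of the original argument), one can compare the closed-form differential entropies of three unit-variance densities; the Gaussian value $\frac12\log(2\pi e)$ dominates the others:

```python
import math

# Closed-form differential entropies, each density normalized to unit variance:
#   Gaussian N(0, sigma^2):  h = (1/2) log(2*pi*e*sigma^2)
#   Uniform on [-a, a]:      h = log(2a),     variance a^2/3
#   Laplace(0, b):           h = 1 + log(2b), variance 2*b^2
h_gauss = 0.5 * math.log(2 * math.pi * math.e)   # sigma^2 = 1
a = math.sqrt(3.0)                               # a^2/3 = 1
h_unif = math.log(2 * a)
b = 1.0 / math.sqrt(2.0)                         # 2*b^2 = 1
h_laplace = 1.0 + math.log(2 * b)

print(h_gauss, h_unif, h_laplace)
# The Gaussian maximizes entropy at fixed variance.
assert h_gauss > h_unif and h_gauss > h_laplace
```

The same comparison holds for any unit-variance density, by the classical maximum entropy characterization of the Gaussian.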

Of course, for nonlinear dynamics, the initial data may play a subtle role. The same idea is present in the notion of gradient flow equations. Beyond statistical physics, the maximum entropy principle plays a role in Bayesian statistics. It also has something to do with the consistency of the maximum likelihood estimator.

For an ergodic and reversible Markov process, the equilibrium is typically a Gibbs measure, and the free energy plays the role of the entropy and is monotonic. A Gibbs measure maximizes entropy under an average energy constraint involving its potential. The monotonicity does not contradict the reversibility, because reversibility is a property of the equilibrium, and has nothing to do with the initial data.
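This monotonicity is easy to see numerically. Here is a minimal sketch (assuming numpy; the chain and target measure are of my own choosing): a three-state Metropolis chain reversible with respect to a Gibbs-like measure $\pi$, along which the relative entropy $H(\mu_t\,|\,\pi)$, which is the free energy up to additive and multiplicative constants, decreases at every step:

```python
import numpy as np

# An arbitrary target measure pi on 3 states.
pi = np.array([0.5, 0.3, 0.2])

# Metropolis transition matrix with uniform proposal: reversible w.r.t. pi,
# since pi_i P_ij = (1/3) min(pi_i, pi_j) is symmetric in (i, j).
n = 3
P = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            P[i, j] = (1.0 / n) * min(1.0, pi[j] / pi[i])
    P[i, i] = 1.0 - P[i].sum()

def kl(mu, nu):
    # Relative entropy H(mu | nu), with the convention 0 log 0 = 0.
    mask = mu > 0
    return float(np.sum(mu[mask] * np.log(mu[mask] / nu[mask])))

mu = np.array([1.0, 0.0, 0.0])   # Dirac initial condition, far from equilibrium
values = []
for _ in range(50):
    values.append(kl(mu, pi))
    mu = mu @ P

# The free energy (relative entropy) is non-increasing along the dynamics.
assert all(x >= y - 1e-12 for x, y in zip(values, values[1:]))
```

The monotonicity here does not use reversibility at all: relative entropy to the invariant measure decreases along any Markov chain, by the data processing inequality.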

Another interesting problem involving entropy and monotonicity emerged from information theory and was stated by Shannon: is the Boltzmann entropy monotonic along the standard central limit theorem? And at which speed does it converge? Here the dynamics is related to independence and convolution, and the conservation law is the variance. This problem was solved dozens of years later by several authors including Artstein, Ball, Barthe, and Naor. The central limit theorem is available in many contexts beyond the classical Abelian case, including the Voiculescu operator algebras of free probability (Shlyakhtenko has solved the problem positively in that setting) and Lie groups. It is tempting to formulate the Shannon conjecture on (non-compact) Lie groups with dilations. The answer is unknown, even for the Heisenberg group.
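One can at least watch this monotonicity numerically in a case where the convolutions are explicit. The following sketch (my own, not a proof) uses the Irwin-Hall density of a sum of $n$ i.i.d. uniforms, standardizes it to unit variance via $h(aX)=h(X)+\log a$, and checks that the entropy increases with $n$ towards the Gaussian maximum $\frac12\log(2\pi e)$:

```python
import math

def irwin_hall_density(x, n):
    # Density of the sum of n i.i.d. Uniform(0,1) variables, for 0 < x < n:
    # f_n(x) = (1/(n-1)!) * sum_k (-1)^k C(n,k) (x-k)_+^{n-1}
    s = 0.0
    for k in range(n + 1):
        if x > k:
            s += (-1) ** k * math.comb(n, k) * (x - k) ** (n - 1)
    return s / math.factorial(n - 1)

def normalized_entropy(n, steps=100000):
    # Entropy of (S_n - n/2)/sqrt(n/12) (unit variance), computed as
    # h(S_n) - (1/2) log(n/12) with h(S_n) = -int f log f by Riemann sum.
    h = 0.0
    dx = n / steps
    for i in range(1, steps):
        f = irwin_hall_density(i * dx, n)
        if f > 1e-300:
            h -= f * math.log(f) * dx
    return h - 0.5 * math.log(n / 12.0)

hs = [normalized_entropy(n) for n in range(1, 5)]
h_gauss = 0.5 * math.log(2 * math.pi * math.e)
print(hs, h_gauss)
assert all(a < b for a, b in zip(hs, hs[1:]))   # entropy increases along the CLT
assert all(h < h_gauss for h in hs)             # Gaussian entropy is the maximum
```

Of course this only tests full monotonicity, proved by Artstein, Ball, Barthe, and Naor, on one family of examples.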

Let $(X_{jk})_{j,k\geq1}$ be an infinite table of i.i.d. centered complex random variables with unit variance and all moments finite. Set $M:=(X_{jk})_{1\leq j,k\leq n}$. I do believe that, with probability one and in expectation, the $*$-moments of $n^{-1/2}M$ converge as $n\to\infty$ to the $*$-moments of the Voiculescu circular element. This result is well known when $X_{11}$ is Gaussian. There may be a proof written somewhere, involving some path combinatorics. Can you link this question to the work of Nourdin and Peccati?

If it holds, this statement shows that the hypothesis in the Śniady theorem is always satisfied by these random matrices. Moreover, the regularization by a Ginibre ensemble is not needed, thanks to the Tao and Vu bound on the smallest singular value. Well, TT will probably say a word about this in his recent course.
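As a quick Monte Carlo sanity check of the circular-element claim above (a sketch assuming numpy, with Rademacher entries; not a proof): if $c$ is circular then $cc^*$ is a free Poisson element, so $\tau((cc^*)^k)$ is the $k$-th Catalan number, and the normalized traces $\frac1n\mathrm{tr}\,((MM^*/n)^k)$ should be close to $1, 2, 5, 14$ for moderate $n$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# i.i.d. Rademacher entries (mean 0, variance 1), normalized by sqrt(n).
M = rng.choice([-1.0, 1.0], size=(n, n)) / np.sqrt(n)
S = M @ M.conj().T

# Normalized traces of powers of M M*, to compare with Catalan numbers.
moments = [float(np.trace(np.linalg.matrix_power(S, k)).real) / n
           for k in range(1, 5)]
catalan = [1, 2, 5, 14]
print(moments, catalan)
for mk, ck in zip(moments, catalan):
    assert abs(mk - ck) < 0.1 * ck + 0.1
```

This only probes the $*$-moments of words of the form $(MM^*)^k$, of course; mixed words would need the same treatment.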

Have you heard about Nonnegative Matrix Factorization (NMF)? It has some similarities with Principal Component Analysis (PCA) and belongs to the family of rank or dimension reduction techniques. NMF incorporates a nonnegativity constraint, and is not based on the Singular Value Decomposition (SVD). It can be used with a Bregman entropic pseudo-distance. I heard about NMF during a talk on speech recognition at INRIA Bordeaux I. It seems that NMF has had some success in various domains of application. It is quite natural to think about some adaptive sparse NMF, an analogue of sparse PCA.
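For the curious reader, here is a minimal sketch of NMF via the classical multiplicative updates of Lee and Seung for the squared Frobenius error (assuming numpy; the function name, sizes, and iteration count are illustrative choices of mine):

```python
import numpy as np

def nmf(V, r, iters=500, eps=1e-9, seed=0):
    # Lee-Seung multiplicative updates for V ~ W @ H, with entrywise
    # nonnegative factors W (m x r) and H (r x n), minimizing ||V - W H||_F^2.
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    errs = []
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # updates preserve nonnegativity
        W *= (V @ H.T) / (W @ H @ H.T + eps)
        errs.append(float(np.linalg.norm(V - W @ H)))
    return W, H, errs

# Exactly rank-3 nonnegative data, so a rank-3 NMF can fit it well.
rng = np.random.default_rng(1)
V = rng.random((20, 3)) @ rng.random((3, 30))
W, H, errs = nmf(V, r=3)
rel_err = errs[-1] / float(np.linalg.norm(V))
print(rel_err)
assert (W >= 0).all() and (H >= 0).all()
assert errs[-1] <= errs[0]
```

The multiplicative form of the updates is what keeps the factors nonnegative without any projection step; a Bregman (e.g. Kullback-Leibler) objective leads to similar update rules.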
