Press "Enter" to skip to content

Category: Uncategorized

Analysis, probability, statistics

Each journal article in MathSciNet is tagged with one or more MSC classification numbers. Here are the graphics for a few of them around analysis, probability, and statistics (the classes are not exclusive, since an article can carry several classification numbers).

Same kind of graphics with more data (beware that the colors are not the same):


Publications

I have learned recently from my colleague Patrick Cattiaux, via Michel Ledoux, that it is very easy to get from MathSciNet (MSN) the total number of publications per year. Actually I already knew that from my colleague Jean Dolbeault. It is tempting to play with this data. Here are the graphics obtained with a few lines of scripting, for journal articles, books, and proceedings. The numbers depend on the selection of titles made by the Mathematical Reviews, which may vary over time, and which can be seen as a good definition of mathematics at large.
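For the record, here is a minimal plotting sketch in Python. It assumes that the yearly totals have already been copied by hand from the MSN search interface into a CSV file; the file name and column names below are hypothetical.

```python
# A minimal plotting sketch, assuming the yearly totals have been exported by hand
# from MathSciNet into a CSV file "msn_counts.csv" with columns
# year,articles,books,proceedings (file name and columns are hypothetical).
import csv
import matplotlib.pyplot as plt

years, articles, books, proceedings = [], [], [], []
with open("msn_counts.csv") as f:
    for row in csv.DictReader(f):
        years.append(int(row["year"]))
        articles.append(int(row["articles"]))
        books.append(int(row["books"]))
        proceedings.append(int(row["proceedings"]))

plt.plot(years, articles, label="journal articles")
plt.plot(years, books, label="books")
plt.plot(years, proceedings, label="proceedings")
plt.xlabel("year")
plt.ylabel("number of publications")
plt.title("Publications per year (MathSciNet)")
plt.legend()
plt.savefig("msn_counts.png")
```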

Now the same with the Zentralblatt MATH (ZBM) database:


Projections

This post is about a remarkable projective property of certain Boltzmann-Gibbs measures.

The model. Let $d\geq1$ and $g:\mathbb{R}^d\to(-\infty,+\infty]$ be continuous with $g(x)<\infty$ for all $x\neq0$. For all $\beta>0$, $n\geq2$, let $P_n$ be the probability measure on $(\mathbb{R}^d)^n$ with density proportional to $$(x_1,\ldots,x_n)\in(\mathbb{R}^d)^n\mapsto\exp(-\beta H(x_1,\ldots,x_n))$$ where $$H(x_1,\ldots,x_n)=\sum_{i=1}^n\frac{1}{2}|x_i|^2+\sum_{i\neq j}g(x_i-x_j).$$ Let $$X=(X_{n,1},\ldots,X_{n,n})\sim P_n.$$

A projection. Let $p:\mathbb{R}^d\to\mathbb{R}^d$ be the orthogonal projection onto a subspace $E\subset\mathbb{R}^d$. Let $\pi$ and $\pi^\perp$ be the orthogonal projections onto the subspaces $$L=\{(p(z),\ldots,p(z))\in(\mathbb{R}^d)^n:z\in\mathbb{R}^d\}\quad\text{and}\quad L^\perp.$$ We have, for all $x\in(\mathbb{R}^d)^n$,

$$\pi(x)=(p(s(x)),\ldots,p(s(x)))\quad\text{where}\quad s(x):=\frac{x_1+\cdots+x_n}{n}\in\mathbb{R}^d.$$

Indeed, we have
$$
L^\perp=\{(x_1,\ldots,x_n)\in(\mathbb{R}^d)^n:(x_1+\cdots+x_n)\cdot
p(z)=0\text{ for all }z\in\mathbb{R}^d\},
$$
and for all $x=(x_1,\ldots,x_n)\in(\mathbb{R}^d)^n$ and all $(z,\ldots,z)\in L$ with $z\in E$, we have
\begin{align}((x_1,\ldots,x_n)-(p(s(x)),\ldots,p(s(x))))\cdot(z,\ldots,z) &=\sum_{i=1}^nx_i\cdot z-np(s(x))\cdot z\\&=\sum_{i=1}^nx_i\cdot z-\sum_{i=1}^np(x_i)\cdot z\\&=\sum_{i=1}^n(x_i-p(x_i))\cdot z\\&=0,\end{align}
where the second equality uses $np(s(x))=p(x_1+\cdots+x_n)=p(x_1)+\cdots+p(x_n)$ by linearity of $p$, and the last one uses $x_i-p(x_i)\in E^\perp$ while $z\in E$.
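As a quick numerical sanity check of this formula, here is a small NumPy sketch. It builds an orthonormal basis of $L$ from an orthonormal basis of $E$, projects a random point of $(\mathbb{R}^d)^n$ onto $L$ directly, and compares with $(p(s(x)),\ldots,p(s(x)))$; the dimensions and the subspace $E$ below are arbitrary illustrative choices.

```python
# A small numerical check of pi(x) = (p(s(x)), ..., p(s(x))), for arbitrary d, n
# and a subspace E spanned by random vectors (all choices are illustrative).
import numpy as np

rng = np.random.default_rng(0)
d, n, k = 3, 5, 2                      # ambient dimension, number of particles, dim(E)

# Orthonormal basis of E obtained from random vectors via QR.
Q, _ = np.linalg.qr(rng.standard_normal((d, k)))
P = Q @ Q.T                            # matrix of p: orthogonal projection onto E

x = rng.standard_normal((n, d))        # a point of (R^d)^n, one row per particle

# Direct projection onto L: an orthonormal basis of L is (q, ..., q)/sqrt(n)
# for q running over the orthonormal basis of E.
U = np.stack([np.tile(Q[:, j], n) / np.sqrt(n) for j in range(k)], axis=1)  # (dn, k)
pi_x_direct = (U @ U.T @ x.reshape(-1)).reshape(n, d)

# Closed-form projection: repeat p(s(x)) on each row.
s = x.mean(axis=0)
pi_x_formula = np.tile(P @ s, (n, 1))

print(np.allclose(pi_x_direct, pi_x_formula))   # expected: True
```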

Example 1: if $E=\mathbb{R}^d$ then $p(x)=x$ and $\pi(x)=(s(x),\ldots,s(x))$.

Example 2: if $E=\mathbb{R}z$ for some $z\in\mathbb{R}^d$ with $|z|=1$, then $p(x)=(x\cdot z)z$.

The statement.  If all the ingredients are as above, then:

  • $\pi(X)$ and $\pi^\perp(X)$ are independent random vectors;
  • $\pi(X)$ is Gaussian with law $\mathcal{N}(0,\frac{1}{\beta}I_{\mathrm{dim}(E)})$ in an orthonormal basis of $L$;
  • $\pi^\perp(X)$ has law with density proportional to $x\in L^\perp\mapsto\mathrm{e}^{-\beta H(x)}$ with respect to the trace of the Lebesgue measure on the linear subspace $L^\perp$ of $(\mathbb{R}^d)^n$.

A proof. For all $x\in(\mathbb{R}^d)^n$, from $x=\pi(x)+\pi^\perp(x)$ we get
$$|x|^2=|\pi(x)|^2+|\pi^\perp(x)|^2$$
and on the other hand, for all $i,j\in\{1,\ldots,n\}$,

\begin{align}x_i-x_j&=\pi(x)_i+\pi^\perp(x)_i-\pi(x)_j-\pi^\perp(x)_j\\&=p(s(x))+\pi^\perp(x)_i-p(s(x))-\pi^\perp(x)_j\\&=\pi^\perp(x)_i-\pi^\perp(x)_j.\end{align}

Combining these two observations, it follows that for all $x=(x_1,\ldots,x_n)\in(\mathbb{R}^d)^n$, $$H(x)=\frac{1}{2}|x|^2+\sum_{i\neq j}g(x_i-x_j)=\frac{1}{2}|\pi(x)|^2+H(\pi^\perp(x)).$$
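Here is a small NumPy check of this decomposition, taking $E=\mathbb{R}^d$ (so that $p$ is the identity and $\pi(x)=(s(x),\ldots,s(x))$) and an illustrative continuous interaction $g(u)=|u|$; any other admissible $g$ would do.

```python
# Numerical check of H(x) = (1/2)|pi(x)|^2 + H(pi_perp(x)), with an illustrative
# interaction g(u) = |u| and E = R^d (so p is the identity).
import numpy as np

rng = np.random.default_rng(1)
d, n = 2, 6

def H(x):
    # x has shape (n, d); sum over ordered pairs i != j of g(x_i - x_j).
    diffs = x[:, None, :] - x[None, :, :]
    g = np.linalg.norm(diffs, axis=-1)          # g(u) = |u|, illustrative
    np.fill_diagonal(g, 0.0)                    # exclude the diagonal i = j
    return 0.5 * np.sum(x**2) + np.sum(g)

x = rng.standard_normal((n, d))
pi_x = np.tile(x.mean(axis=0), (n, 1))          # pi(x) = (s(x), ..., s(x)) since p = id
pi_perp_x = x - pi_x

print(np.isclose(H(x), 0.5 * np.sum(pi_x**2) + H(pi_perp_x)))   # expected: True
```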

Let $u_1,\ldots,u_{dn}$ be an orthonormal basis of $(\mathbb{R}^d)^n=\mathbb{R}^{dn}$ such that $u_1,\ldots,u_{\mathrm{dim}(E)}$ is an orthonormal basis of $L$. For all $x\in(\mathbb{R}^d)^n$ we write $x=\sum_{i=1}^{dn}t_i(x)u_i$. We have $$\pi(x)=\sum_{i=1}^{\mathrm{dim}(E)}t_i(x)u_i\quad\text{and}\quad\pi^\perp(x)=\sum_{i=\mathrm{dim}(E)+1}^{dn}t_i(x)u_i.$$ For all bounded measurable $f:L\to\mathbb{R}$ and $h:L^\perp\to\mathbb{R}$,
\begin{align}\mathbb{E}(f(\pi(X))h(\pi^\perp(X))) &=Z^{-1}\int_{(\mathbb{R}^d)^n}f(\pi(x))h(\pi^\perp(x))\mathrm{e}^{-\frac{\beta}{2}|\pi(x)|^2}\mathrm{e}^{-\beta H(\pi^\perp(x))}\mathrm{d}x_1\cdots\mathrm{d}x_n\\&=Z^{-1}\Bigl(\int_{\mathbb{R}^{\mathrm{dim}(E)}}f(t')\mathrm{e}^{-\frac{\beta}{2}|t'|^2}\mathrm{d}t'\Bigr)\Bigl(\int_{\mathbb{R}^{dn-\mathrm{dim}(E)}}h(t'')\mathrm{e}^{-\beta H(t'')}\mathrm{d}t''\Bigr)\end{align}

where $t':=\sum_{i=1}^{\mathrm{dim}(E)}t_iu_i$, $\mathrm{d}t':=\prod_{i=1}^{\mathrm{dim}(E)}\mathrm{d}t_i$, $t'':=\sum_{i=\mathrm{dim}(E)+1}^{dn}t_iu_i$, and $\mathrm{d}t'':=\prod_{i=\mathrm{dim}(E)+1}^{dn}\mathrm{d}t_i$, the change of variables being valid since the basis $(u_i)$ is orthonormal and thus preserves the Lebesgue measure. This product form gives the independence of $\pi(X)$ and $\pi^\perp(X)$, identifies the law of $\pi(X)$ in the orthonormal basis of $L$ as $\mathcal{N}(0,\frac{1}{\beta}I_{\mathrm{dim}(E)})$, and identifies the law of $\pi^\perp(X)$ as announced.
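To see the statement at work numerically, here is a hedged Monte Carlo sketch: a plain random-walk Metropolis sampler targeting $P_n$ with $E=\mathbb{R}^d$ and the same illustrative interaction $g(u)=|u|$ as above, followed by a check that the coordinates of $\pi(X)$ in an orthonormal basis of $L$, namely $\sqrt{n}\,s(X)$, have empirical covariance close to $\frac{1}{\beta}I_d$. All tuning parameters are arbitrary.

```python
# Monte Carlo sanity check of the statement: random-walk Metropolis targeting P_n
# with g(u) = |u| and E = R^d, then empirical covariance of sqrt(n) * s(X).
# All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(2)
d, n, beta = 2, 4, 1.5
step, n_sweeps, burn_in = 0.5, 200_000, 20_000

def H(x):
    diffs = x[:, None, :] - x[None, :, :]
    g = np.linalg.norm(diffs, axis=-1)            # g(u) = |u|, and g(0) = 0 on the diagonal
    return 0.5 * np.sum(x**2) + np.sum(g)

x = rng.standard_normal((n, d))
Hx = H(x)
samples = []                                      # samples of sqrt(n) * s(X)
for sweep in range(n_sweeps):
    y = x + step * rng.standard_normal((n, d))    # global random-walk proposal
    Hy = H(y)
    if np.log(rng.random()) < -beta * (Hy - Hx):  # Metropolis acceptance rule
        x, Hx = y, Hy
    if sweep >= burn_in:
        samples.append(np.sqrt(n) * x.mean(axis=0))

samples = np.array(samples)
print("empirical covariance of sqrt(n) s(X):")
print(np.cov(samples.T))                          # expected: close to (1/beta) * I_d
print("predicted variance 1/beta =", 1 / beta)
```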

Further reading. arXiv:1805.00708

 
