This short post is devoted to one of those beautiful elementary facts, which can be found in a paper by Alfréd Rényi (1921 – 1970) entitled On the theory of order statistics, published in Acta Math. Acad. Sci. Hungar. 4 (1953) 191-231 and dedicated to A. N. Kolmogorov on the occasion of his fiftieth birthday. It seems that it can also be found in a paper by Benjamin Epstein and Milton Sobel entitled Life testing, published in J. Amer. Statist. Assoc. 48 (1953) 486-502.

The identity. Let ${E_1,\ldots,E_n}$ be i.i.d. real random variables following the exponential distribution of rate ${\lambda>0}$ on ${[0,\infty)}$. Since ${(E_1,\ldots,E_n)}$ has a density, it follows that with probability one, the random variables ${E_1,\ldots,E_n}$ are pairwise distinct, and therefore there exists a unique random permutation ${\sigma\in S_n}$ such that ${E_{\sigma(1)}<\cdots< E_{\sigma(n)}}$. Let us write ${E_{(k)}:=E_{\sigma(k)}}$. Note that ${E_{(1)}=\min(E_1,\ldots,E_n)}$ and ${E_{(n)}=\max(E_1,\ldots,E_n)}$. We have the following beautiful identity in distribution:

$(E_{(1)},\ldots,E_{(n)}) \overset{d}{=} \left(\frac{E_1}{n},\cdots,\frac{E_1}{n}+\cdots+\frac{E_n}{1}\right).$

In particular ${E_{(k)}\overset{d}{=}\frac{E_1}{n}+\cdots+\frac{E_{k}}{n-k+1}}$ for every ${1\leq k\leq n}$. To see it, for every bounded and measurable test function ${\varphi:\mathbb{R}^n\rightarrow\mathbb{R}}$, we get, using changes of variables,

$\begin{array}{rcl} \mathbb{E}(\varphi(E_{(1)},\ldots,E_{(n)})) &=&\int_{[0,\infty)^n}\!\varphi(x_{(1)},\ldots,x_{(n)})\, \lambda^ne^{-\lambda(x_1+\cdots+x_n)}\,dx_1\cdots dx_n\\ &=&n!\int_{0\leq y_1<\cdots<y_n}\!\varphi(y_{1},\ldots,y_{n})\, \lambda^ne^{-\lambda(y_1+\cdots+y_n)}\,dy_1\cdots dy_n\\ &=&\int_{[0,\infty)^n}\!\varphi(z_1,\ldots,z_1+\cdots+z_n)\, n!\lambda^ne^{-\lambda(nz_1+(n-1)z_2+\cdots+z_n)}\,dz_1\cdots dz_n, \end{array}$

where the last step uses the change of variables ${z_1:=y_1}$ and ${z_k:=y_k-y_{k-1}}$ for ${2\leq k\leq n}$.

It follows that the real random variables ${E_{(1)},E_{(2)}-E_{(1)},\ldots,E_{(n)}-E_{(n-1)}}$ are independent, with ${E_{(k)}-E_{(k-1)}}$ exponential of mean ${1/((n-k+1)\lambda)}$, with the convention ${E_{(0)}:=0}$.
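As a quick sanity check (my own illustration, not from Rényi's paper), one can compare a Monte Carlo estimate of ${\mathbb{E}(E_{(k)})}$ with the value ${\frac{1}{\lambda}(\frac{1}{n}+\cdots+\frac{1}{n-k+1})}$ predicted by the identity. A minimal Python sketch, where the choices ${\lambda=1}$, ${n=5}$, the seed, and the sample size are arbitrary:

```python
import random
import statistics

# Check the Rényi representation E_(k) =d E_1/n + ... + E_k/(n-k+1)
# by comparing sample means of the k-th order statistic (obtained by
# sorting) with the exact mean (1/lam) * (1/n + ... + 1/(n-k+1)).
random.seed(0)
lam, n, trials = 1.0, 5, 200_000

def kth_order_stat(k):
    """Sample E_(k) by sorting n i.i.d. Exp(lam) draws."""
    return sorted(random.expovariate(lam) for _ in range(n))[k - 1]

for k in (1, n):
    mc = statistics.fmean(kth_order_stat(k) for _ in range(trials))
    exact = sum(1.0 / (n - j + 1) for j in range(1, k + 1)) / lam
    assert abs(mc - exact) < 0.02, (k, mc, exact)
```

For ${k=n}$ this recovers the classical fact that the mean of the maximum is the harmonic number ${H_n/\lambda}$.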

The identity ${E_{(1)}\overset{d}{=}\frac{1}{n}E_1}$ is a special case of the more general identity

$\min\left(\frac{E_1}{\lambda_1},\ldots,\frac{E_n}{\lambda_n}\right) \overset{d}{=}\frac{E_1}{\lambda_1+\cdots+\lambda_n},$

valid for any ${\lambda_1>0,\ldots,\lambda_n>0}$, which plays a fundamental role in the structure of Markov processes. With probability one the ${\min}$ is achieved at a unique random integer ${N}$ in ${\{1,\ldots,n\}}$ independent of the value of the minimum and distributed according to the discrete law ${\mathbb{P}(N=k)=\lambda_k/(\lambda_1+\cdots+\lambda_n)}$ for every ${1\leq k\leq n}$.
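Both facts are easy to test numerically. Here is a minimal sketch (my own, assuming the ${E_i}$ are standard exponentials of rate ${1}$, so that ${E_i/\lambda_i}$ is exponential of rate ${\lambda_i}$; the rates, seed, and sample size are arbitrary choices):

```python
import random
from collections import Counter

# The minimum of independent exponentials of rates lam[0],...,lam[n-1]
# should be exponential of rate sum(lam) (so of mean 1/sum(lam)), and
# the index achieving the minimum should satisfy P(N=k) = lam[k]/sum(lam).
random.seed(1)
lam = [1.0, 2.0, 3.0]
total, trials = sum(lam), 200_000

wins, acc = Counter(), 0.0
for _ in range(trials):
    draws = [random.expovariate(r) for r in lam]
    k = min(range(len(lam)), key=draws.__getitem__)  # index of the min
    wins[k] += 1
    acc += min(draws)

assert abs(acc / trials - 1.0 / total) < 0.01        # mean 1/(lam_1+...+lam_n)
for k, r in enumerate(lam):
    assert abs(wins[k] / trials - r / total) < 0.01  # P(N=k) = lam_k/total
```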

The identity ${E_{(n)}\overset{d}{=}\frac{E_1}{n}+\cdots+\frac{E_n}{1}}$ appears in the analysis of the jump times of the continuous time regular branching tree called the Yule process, and in the analysis of the most recent common ancestor in the genealogy of the Fisher-Wright model.

To go a bit further, if ${F_E}$ is the cumulative distribution function of the ${E_i}$’s, which is exponential: ${F_E(x)=(1-e^{-\lambda x})\mathbf{1}_{[0,\infty)}(x)}$, and if ${X_1,\ldots,X_n}$ are i.i.d. with continuous cumulative distribution function ${F_X}$, then, by monotonicity,

$(X_{(1)},\ldots,X_{(n)}) \overset{d}{=} \left((F_X^{-1}\circ F_E)\left(\frac{E_1}{n}\right), \ldots,(F_X^{-1}\circ F_E)\left(\frac{E_1}{n}+\cdots+\frac{E_n}{1}\right)\right).$
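This gives a way to sample the full vector of order statistics sequentially, without sorting. A minimal sketch (my own illustration, assuming ${\lambda=1}$ and ${X_i}$ uniform on ${[0,1]}$, so that ${F_X^{-1}}$ is the identity), which checks the classical means ${\mathbb{E}(U_{(k)})=k/(n+1)}$ of uniform order statistics:

```python
import math
import random

# Sample (U_(1),...,U_(n)) in one pass via the cumulative exponential
# spacings: U_(k) = F_E(E_1/n + ... + E_k/(n-k+1)) with F_E(x) = 1-e^{-x}.
random.seed(2)
n, trials = 4, 200_000

def uniform_order_stats():
    out, s = [], 0.0
    for k in range(1, n + 1):
        s += random.expovariate(1.0) / (n - k + 1)  # add E_k/(n-k+1)
        out.append(1.0 - math.exp(-s))              # apply F_E (here F_X^{-1} = id)
    return out

sums = [0.0] * n
for _ in range(trials):
    for k, u in enumerate(uniform_order_stats()):
        sums[k] += u

for k in range(n):
    assert abs(sums[k] / trials - (k + 1) / (n + 1)) < 0.01
```

The output is sorted by construction, which is the whole point: the cost is ${O(n)}$ per sample instead of ${O(n\log n)}$ for sorting.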

Link with Dirichlet distribution. Actually, it is more customary to express order statistics in terms of i.i.d. uniforms rather than in terms of i.i.d. exponentials. Order statistics of uniforms are linked with Dirichlet distributions. Namely, the random variables ${U_1:=F_X(X_1),\ldots,U_n:=F_X(X_n)}$ are i.i.d. uniform on ${[0,1]}$, and therefore it is well known that the vector of spacings ${(U_{(1)},U_{(2)}-U_{(1)},\ldots,1-U_{(n)})}$, which has ${n+1}$ components summing to one, follows the Dirichlet distribution of parameter ${(1,\ldots,1)}$, and thus, by monotonicity,

$(F_X(X_{(1)}),\ldots,F_X(X_{(n)})) \overset{d}{=} \frac{(E_1,\ldots,E_1+\cdots+E_n)}{E_1+\cdots+E_{n+1}}.$

In particular, if the ${X_i}$’s are uniform on ${[0,1]}$ then we get the identity in distribution

$\left(F_E\left(\frac{E_1}{n}\right), \ldots,F_E\left(\frac{E_1}{n}+\cdots+\frac{E_n}{1}\right)\right) \overset{d}{=} \frac{(E_1,\ldots,E_1+\cdots+E_n)}{E_1+\cdots+E_{n+1}}.$

In particular, for every ${1\leq k\leq n}$, the random variable ${F_E(E_1/n+\cdots+E_k/(n-k+1))}$ has the same distribution as ${(E_1+\cdots+E_k)/(E_1+\cdots+E_{n+1})}$, a marginal of a Dirichlet distribution, which follows the Beta distribution on ${[0,1]}$ of parameter ${(k,n-k+1)}$. All these funny identities in distribution are actually the tip of an iceberg involving Beta, Gamma, and Dirichlet distributions and Poisson point processes. They are useful for instance in the analysis of Kolmogorov-Smirnov type statistics, which was actually the main motivation of Rényi's paper. They are also useful in many other situations: one may take a look for instance at the paper by Raoul LePage, Michael Woodroofe, and Joel Zinn entitled Convergence to a stable distribution via order statistics, published in Ann. Probab. 9 (1981) 624-632. We used this approach with Charles Bordenave and Pietro Caputo in our analysis of random Markov chain models in arXiv:0903.3528, published in Ann. Probab. 39(4) (2011) 1544-1590.
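The Beta marginal is easy to check by simulation: the ratio ${(E_1+\cdots+E_k)/(E_1+\cdots+E_{n+1})}$ should follow the Beta distribution of parameter ${(k,n-k+1)}$, whose mean is ${k/(n+1)}$ and whose variance is ${k(n-k+1)/((n+1)^2(n+2))}$. A minimal sketch (my own; the values of ${n}$, ${k}$, the seed, and the sample size are arbitrary):

```python
import random
import statistics

# Compare the Monte Carlo mean and variance of the Dirichlet marginal
# (E_1+...+E_k)/(E_1+...+E_{n+1}) with the exact Beta(k, n-k+1) moments.
random.seed(3)
n, k, trials = 6, 2, 200_000

samples = []
for _ in range(trials):
    e = [random.expovariate(1.0) for _ in range(n + 1)]
    samples.append(sum(e[:k]) / sum(e))

a, b = k, n - k + 1                          # Beta(k, n-k+1) parameters
mean = a / (a + b)                           # = k/(n+1)
var = a * b / ((a + b) ** 2 * (a + b + 1))   # = k(n-k+1)/((n+1)^2 (n+2))
assert abs(statistics.fmean(samples) - mean) < 0.01
assert abs(statistics.pvariance(samples) - var) < 0.01
```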
