Suppose that we would like to describe mathematically the convergence of a sequence ${(X_n)}_n$ of random variables towards a limiting random variable $X_\infty$, as $n\to\infty$. We have to select a notion of convergence. If we decide to use almost sure convergence, we need to define all the $X_n$’s as well as the limit $X_\infty$ on a common probability space in order to give a meaning to $$\mathbb{P}(\lim_{n\to\infty}X_n=X_\infty)=1.$$ This means that we need to **couple** the random variables. If we decide to use convergence in probability or in $L^p$, we have to define, for every $n$, both $X_n$ and $X_\infty$ on the same probability space in order to give a meaning to $\mathbb{P}(|X_n-X_\infty|>\varepsilon)$ and $\mathbb{E}(|X_n-X_\infty|^p)$ respectively, and therefore we end up defining all the $X_n$’s as well as $X_\infty$ on a common probability space. However, if we decide to use convergence in law (i.e. in distribution), then we do not need to define the random variables on a common probability space at all.
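Indeed, convergence in law involves only the laws of the random variables, taken one at a time: $$\lim_{n\to\infty}X_n=X_\infty\ \text{in law}\quad\Longleftrightarrow\quad\lim_{n\to\infty}\mathbb{E}(f(X_n))=\mathbb{E}(f(X_\infty))\ \text{for all bounded continuous }f,$$ and each expectation depends only on the law of the corresponding random variable, so that no coupling is needed.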

In the special case where $X_\infty$ is deterministic, convergence in probability or in $L^p$ no longer requires defining the random variables on the same probability space. However, almost sure convergence still does. Moreover, if we impose that the almost sure convergence holds regardless of the way we define the random variables on the same probability space (i.e. for arbitrary couplings), then we end up with the important notion of **complete convergence**, which is equivalent, thanks to the Borel-Cantelli lemmas, to a summable convergence in probability. Note that when the limit is deterministic, we also know that convergence in law is equivalent to convergence in probability. Moreover, we know in general from the first Borel-Cantelli lemma that a summable convergence in probability implies almost sure convergence. Furthermore, convergence in probability becomes summable quite easily under moment conditions.
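Explicitly, complete convergence of ${(X_n)}_n$ towards a constant $c$ means that for every $\varepsilon>0$, $$\sum_{n=1}^\infty\mathbb{P}(|X_n-c|>\varepsilon)<\infty.$$ By the first Borel-Cantelli lemma this forces $\mathbb{P}(|X_n-c|>\varepsilon\text{ infinitely often})=0$ whatever the joint law of the $X_n$’s, hence almost sure convergence for arbitrary couplings. Conversely, if the series diverges for some $\varepsilon>0$, then the second Borel-Cantelli lemma applied to an independent coupling gives $|X_n-c|>\varepsilon$ infinitely often almost surely, ruling out almost sure convergence for that coupling.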

Following Hsu & Robbins, if we consider $X_n=\frac{1}{n}(Z_1+\cdots+Z_n)$ where $Z_1,\ldots,Z_n$ are independent copies of some $Z$ of mean $m$, then the sequence ${(X_n)}_n$ converges **completely** towards $m$ as soon as $Z$ has a **finite second moment**, and by a converse due to Erdős this condition is in fact necessary. This sheds an interesting light on the **law of large numbers for triangular arrays**.
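In formulas, writing $S_n=Z_1+\cdots+Z_n$, the Hsu-Robbins theorem together with the Erdős converse states that for every $\varepsilon>0$, $$\sum_{n=1}^\infty\mathbb{P}\Bigl(\Bigl|\frac{S_n}{n}-m\Bigr|>\varepsilon\Bigr)<\infty\quad\Longleftrightarrow\quad\mathbb{E}(Z^2)<\infty.$$ Note that the Chebyshev inequality only bounds each term by $\mathrm{Var}(Z)/(n\varepsilon^2)$, which is not summable in $n$, so the summability is a genuinely finer phenomenon than this naive bound.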

Some people refuse to consider almost sure convergence a true mode of convergence, in the sense that it is not associated with a metric, contrary to the other modes of convergence. In some sense, it appears as a critical notion in the law of large numbers, when we lower the concentration, typically via integrability (moment conditions). Of course there are plenty of concrete situations, for instance with martingales, in which the coupling is in fact imposed and for which the almost sure convergence towards a non-constant random variable holds very naturally. Famous examples are Pólya urns and Galton-Watson branching processes. The Marchenko-Pastur theorem in random matrix theory provides an example of natural coupling with a limiting object which is deterministic, and the convergence is complete via concentration of measure, provided that the ingredients have enough finite moments.
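As an illustration of such an imposed coupling, here is a minimal simulation sketch of a Pólya urn (the function name and parameters are mine, not standard): along each trajectory the fraction of red balls stabilizes, reflecting the almost sure convergence of this bounded martingale, but the random limit varies from one trajectory to another.

```python
import random

def polya_urn_fraction(steps, red=1, black=1, rng=None):
    """Simulate a Pólya urn: at each step, draw a ball uniformly at random
    and put it back together with an extra ball of the same color.
    Return the fraction of red balls after the given number of steps."""
    rng = rng or random.Random()
    for _ in range(steps):
        if rng.random() < red / (red + black):
            red += 1    # drew a red ball: add another red
        else:
            black += 1  # drew a black ball: add another black
    return red / (red + black)

# Each seed gives one trajectory; the fraction converges along a trajectory,
# but its limit depends on the trajectory: starting from red = black = 1,
# the limit is uniformly distributed on (0, 1), i.e. Beta(1, 1).
for seed in range(3):
    print(polya_urn_fraction(10_000, rng=random.Random(seed)))
```

Running it with several seeds exhibits the phenomenon discussed above: the coupling is dictated by the dynamics itself, and the limit is a non-constant random variable.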

**Note.** The idea of writing this tiny post came from a discussion with my friend Adrien Hardy.