
Two basic probabilistic proofs

[Photo: Johan Jensen (mathematician)]

I enjoy basic beautiful mathematical proofs. I see them as small jewels, which I collect from time to time. In this spirit, this post proposes probabilistic proofs of a couple of basic results.

Jensen inequality. The Jensen inequality states that if $X$ is an integrable random variable on $\mathbb{R}^n$ and $\varphi:\mathbb{R}^n\to\mathbb{R}$ a convex function such that $\varphi(X)$ is integrable, then

$$\varphi(\mathbb{E}(X))\leq\mathbb{E}(\varphi(X)).$$

To prove it, we start by using the convexity of $\varphi$, which gives, for every integer $m\geq1$ and every sequence $x_1,\ldots,x_m$ in $\mathbb{R}^n$,

$$\varphi\Bigl(\frac{x_1+\cdots+x_m}{m}\Bigr)\leq\frac{\varphi(x_1)+\cdots+\varphi(x_m)}{m}.$$

Now, we use the integrability of $X$ and $\varphi(X)$: we take $X_1,X_2,\ldots$ independent and distributed as $X$, we use the strong law of large numbers for both sides as $m\to\infty$, the fact that $\varphi$ is continuous for the left-hand side, and the fact that if $\mathbb{P}(A)=\mathbb{P}(B)=1$ then $\mathbb{P}(A\cap B)=1$. I also appreciate the proof based on the equality for affine functions, the variational expression of a convex function as the envelope of its tangent hyperplanes, together with the fact that the supremum of expectations is less than or equal to the expectation of the supremum.
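As a quick numerical illustration of this law-of-large-numbers argument, here is a minimal Python/NumPy sketch, not from the original argument: the choice of $X$ as a shifted Gaussian vector and of the convex function $\varphi(x)=\|x\|^2$ are mine. The empirical mean of the $\varphi(X_k)$ stays above $\varphi$ of the empirical mean.

```python
# Minimal sketch (assumed example): X is a standard Gaussian on R^2 shifted by (1, 2),
# and phi(x) = ||x||^2, which is convex. Then phi(E(X)) = 5 and E(phi(X)) = 7.
import numpy as np

rng = np.random.default_rng(0)

def phi(x):
    # A convex function on R^n: the squared Euclidean norm.
    return np.sum(x**2, axis=-1)

m = 100_000                                  # number of i.i.d. copies X_1, ..., X_m
X = rng.normal(size=(m, 2)) + np.array([1.0, 2.0])

lhs = phi(X.mean(axis=0))                    # phi of the empirical mean, close to phi(E(X)) = 5
rhs = phi(X).mean()                          # empirical mean of phi(X_k), close to E(phi(X)) = 7

print(lhs, "<=", rhs)                        # roughly 5.0 <= 7.0
```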

Schur-Hadamard product and cone of positive matrices. The Schur-Hadamard product of two square matrices $A,B\in\mathcal{M}_n(\mathbb{R})$ is the matrix $A\circ B\in\mathcal{M}_n(\mathbb{R})$ defined by

$$(A\circ B)_{ij}:=A_{ij}B_{ij}$$

for every $1\leq i,j\leq n$. This entrywise product is denoted .* in Matlab/Octave/Freemat/Scilab.
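For what it is worth, the same entrywise product in Python with NumPy (my choice of language, not mentioned above) is simply the elementwise * operator:

```python
# Entrywise (Schur-Hadamard) product with NumPy: the analogue of .* in Matlab/Octave/Scilab.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
B = np.array([[1.0, 0.5],
              [0.5, 1.0]])

print(A * B)      # (A o B)_ij = A_ij B_ij, here [[2.0, 0.5], [0.5, 2.0]]
```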

Obviously, the Schur-Hadamard product preserves the cone of symmetric matrices. It is however not obvious that if $A$ and $B$ are symmetric positive semidefinite (i.e. with nonnegative spectrum) then $A\circ B$ is also symmetric positive semidefinite.

To prove this remarkable statement, let us recall that the set of symmetric positive semidefinite matrices coincides with the set of covariance matrices of random vectors (and even of Gaussian random vectors). Next, let us consider two independent centered random vectors $X$ and $Y$ of $\mathbb{R}^n$ with respective covariance matrices $A$ and $B$. Now, the random vector $Z=X\circ Y$ of $\mathbb{R}^n$ defined by $Z_i:=X_iY_i$ for every $1\leq i\leq n$ is centered and has covariance matrix $A\circ B$, since by independence $\mathbb{E}(Z_iZ_j)=\mathbb{E}(X_iX_j)\mathbb{E}(Y_iY_j)=A_{ij}B_{ij}$. The matrix $A\circ B$ is thus necessarily symmetric positive semidefinite! Any simpler proof?
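Here is a small Monte Carlo sketch of this construction, again in Python with NumPy; the particular matrices $A$, $B$ and the Gaussian sampling are arbitrary choices of mine. The empirical covariance of $Z$ approaches $A\circ B$, whose spectrum is indeed nonnegative.

```python
# Sketch under assumptions: A and B are random Gram matrices, X ~ N(0, A) and Y ~ N(0, B)
# are independent, Z_i = X_i Y_i. The empirical covariance of Z approximates A o B.
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 200_000

# Two symmetric positive semidefinite matrices, built as Gram matrices M M^T.
M1, M2 = rng.normal(size=(n, n)), rng.normal(size=(n, n))
A, B = M1 @ M1.T, M2 @ M2.T

# Independent centered Gaussian vectors with covariance A and B (m copies each, as rows).
X = rng.multivariate_normal(np.zeros(n), A, size=m)
Y = rng.multivariate_normal(np.zeros(n), B, size=m)
Z = X * Y                                   # Z_i = X_i Y_i, entrywise

emp_cov = (Z.T @ Z) / m                     # empirical covariance of the centered vector Z
print(np.abs(emp_cov - A * B).max())        # small for large m (Monte Carlo error)
print(np.linalg.eigvalsh(A * B))            # spectrum of A o B: all eigenvalues >= 0
```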
