
Back to basics – A moment with moments


I very much like the following elementary observation, inspired by a lemma from an article by Kabluchko and Zaporozhets. Let $X_1,X_2,\ldots$ be independent and identically distributed random variables, and let $\Phi:\mathbb{R}_+\to\mathbb{R}_+$ be an increasing bijection with (increasing) inverse $\Phi^{-1}:\mathbb{R}_+\to\mathbb{R}_+$. Examples: $\Phi(t)=t^p$ with $p>0$, or $\Phi(t)=\log(1+t)$. Then

$$\mathbb{E}(\Phi(|X_1|))<\infty \quad\Leftrightarrow\quad \mathbb{P}\left(\frac{|X_n|}{\Phi^{-1}(n)}<1\mbox{ for } n\gg1\right)=1.$$
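As a quick numerical sanity check (my own illustration, not part of the argument), take $\Phi(t)=t^2$ and $X_n$ standard Gaussian, so that $\mathbb{E}(\Phi(|X_1|))=\mathbb{E}(X_1^2)=1<\infty$; the statement then predicts that almost surely $|X_n|<\sqrt{n}$ for all $n$ large enough:

```python
import math
import random

random.seed(42)

# Sketch with Phi(t) = t^2 and X_n standard Gaussian: E(Phi(|X_1|)) = 1 < infinity,
# so almost surely |X_n| < Phi^{-1}(n) = sqrt(n) for all n large enough.
N = 100_000
exceedances = [n for n in range(1, N + 1)
               if abs(random.gauss(0.0, 1.0)) >= math.sqrt(n)]
print("indices n with |X_n| >= sqrt(n):", exceedances)
```

On a typical run the exceedances occur only at a handful of small indices, in line with the almost-sure statement.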

To see this, recall that $\sum_{n\geq1}\mathbf{1}_{n\leq y} =\lfloor y\rfloor\leq y\leq1+\lfloor y\rfloor=\sum_{n=0}^\infty\mathbf{1}_{n\leq y}$ for every real number $y\geq0$. Taking $y=Y$ and then expectations, it follows that for every non-negative random variable $Y$,

$$\sum_{n=1}^\infty\mathbb{P}(Y\geq n)\leq\mathbb{E}(Y)\leq\sum_{n=0}^\infty\mathbb{P}(Y\geq n).$$
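For instance (a numerical illustration of my own), if $Y$ is exponential with unit mean then $\mathbb{P}(Y\geq n)=e^{-n}$, the bounding sums are geometric, and the sandwich can be checked directly:

```python
import math

# Sandwich check for Y ~ Exponential(1): P(Y >= n) = exp(-n) and E(Y) = 1.
lower = sum(math.exp(-n) for n in range(1, 200))  # sum over n >= 1, = 1/(e-1) ≈ 0.582
upper = 1.0 + lower                               # the n = 0 term adds P(Y >= 0) = 1
mean = 1.0                                        # E(Y) for Exponential(1)
assert lower <= mean <= upper                     # the sandwich holds
print(f"{lower:.3f} <= {mean} <= {upper:.3f}")
```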

Applying this with $Y=\Phi(|X_1|)$, and noting that $\Phi(|X_1|)\geq n$ if and only if $|X_1|\geq\Phi^{-1}(n)$ since $\Phi$ is increasing, we obtain

$$\mathbb{E}(\Phi(|X_1|))<\infty\quad\Leftrightarrow\quad\sum_{n=1}^\infty\mathbb{P}(|X_1|\geq \Phi^{-1}(n))<\infty.$$
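As a concrete instance (again my own example): take $X_1$ Pareto with $\mathbb{P}(X_1\geq t)=t^{-3}$ for $t\geq1$, and $\Phi(t)=t^2$. Then $\mathbb{E}(X_1^2)=3<\infty$ while $\sum_{n\geq1}\mathbb{P}(X_1\geq\sqrt{n})=\sum_{n\geq1}n^{-3/2}=\zeta(3/2)<\infty$, so both sides of the equivalence are indeed finite together:

```python
# Pareto example: P(X >= t) = t^{-3} for t >= 1, Phi(t) = t^2, so with Y = X^2,
# P(Y >= n) = P(X >= sqrt(n)) = n^{-3/2} for every n >= 1, and E(Y) = 3.
tail_sum = sum(n ** -1.5 for n in range(1, 1_000_000))  # ≈ zeta(3/2) ≈ 2.612
moment = 3.0  # E(X^2) = integral over [1, inf) of t^2 * 3 t^{-4} dt = 3
assert tail_sum <= moment <= 1.0 + tail_sum             # the sandwich holds
print(f"{tail_sum:.3f} <= {moment} <= {1.0 + tail_sum:.3f}")
```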

Now $X_1\overset{d}{=}X_n$ for every $n$, and since $X_1,X_2,\ldots$ are independent, the Borel–Cantelli lemmas (the second one uses independence) give

$$\sum_{n\geq1}\mathbb{P}(|X_n|\geq\Phi^{-1}(n))<\infty\quad\Leftrightarrow\quad\mathbb{P}\left(\frac{|X_n|}{\Phi^{-1}(n)}<1\mbox{ for }n\gg1\right)=1.$$
