
Month: October 2020

Back to basics: local martingales

A martingale.

This post is inspired by the exam of my master course on stochastic calculus. The processes considered in this post are in continuous time, defined on a filtered probability space \( {(\Omega,\mathcal{F},{(\mathcal{F}_t)}_{t\geq0},\mathbb{P})} \), adapted, and have almost surely continuous trajectories.

Local martingales. If \( {{(M_t)}_{t\geq0}} \) is a martingale, the Doob stopping theorem states that if \( {T} \) is a stopping time, then the stopped process \( {{(M_{t\wedge T})}_{t\geq0}} \) is again a martingale.

Stopping can be used in general to truncate the trajectories of a process with a cutoff, in order to gain more integrability or tightness, while keeping adaptation and continuity. Typically if \( {{(X_t)}_{t\geq0}} \) is an adapted process, we could consider the sequence of stopping times \( {{(T_n)}_{n\geq0}} \) defined by

\[ T_n=\inf\{t\geq0:|X_t|\geq n\}, \]

which satisfies almost surely \( {T_n\nearrow+\infty} \) as \( {n\rightarrow\infty} \), and for which, for all \( {n} \), the stopped process \( {{(X_{t\wedge T_n})}_{t\geq0}} \) is bounded by \( {|X_0|\vee n} \). Since \( {X} \) is continuous, almost surely, for all \( {t\geq0} \), \( {\lim_{n\rightarrow\infty}X_{t\wedge T_n}=X_t} \). We say that \( {{(T_n)}_{n\geq0}} \) is a localizing sequence.
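As a concrete illustration, here is a small numerical sketch (a hypothetical discretization, not from the post) computing such hitting times \( {T_n} \) along one simulated Brownian path: they are nondecreasing in \( {n} \), and the path strictly before \( {T_n} \) stays below the level \( {n} \).

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n_steps = 1e-3, 200_000
# one Brownian path on [0, 200], crude Euler discretization
W = np.cumsum(rng.standard_normal(n_steps) * np.sqrt(dt))

# T_n = inf{t >= 0 : |W_t| >= n} for the levels n = 1, 2, 3
hitting_indices = [int(np.argmax(np.abs(W) >= n)) for n in (1, 2, 3)]
T = [dt * i for i in hitting_indices]

for n, i in zip((1, 2, 3), hitting_indices):
    # before the hitting index, the truncated path is bounded by the level n
    print(n, dt * i, np.abs(W[:i]).max())
```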

Now a local martingale is simply an adapted process \( {{(X_t)}_{t\geq0}} \) such that for all \( {n\geq0} \) the stopped process \( {{(X_{t\wedge T_n})}_{t\geq0}} \) is a (bounded) martingale.

Every martingale is a local martingale. However the converse is false, and strict local martingales do exist. We give below one of the most famous examples. Local martingales also pop up naturally in the construction of the Itô stochastic integral. We give below a simple example of a stochastic integral which is a strict local martingale.

Domination. If \( {{(X_t)}_{t\geq0}} \) is a local martingale which is dominated by an integrable random variable, in the sense that \( {\mathbb{E}\sup_{t\geq0}|X_t|<\infty} \), then \( {{(X_t)}_{t\geq0}} \) is a martingale, and in fact a uniformly integrable martingale. Namely, for all \( {t\geq0} \) and all \( {s\in[0,t]} \), by the martingale property of the stopped process and dominated convergence in its conditional form,

\[ \begin{array}{rcl} X_s &=&\lim_{n\rightarrow\infty}X_{s\wedge T_n}\\ &=&\lim_{n\rightarrow\infty}\mathbb{E}(X_{t\wedge T_n}\mid\mathcal{F}_s)\\ &=&\mathbb{E}(\lim_{n\rightarrow\infty}X_{t\wedge T_n}\mid\mathcal{F}_s)\\ &=&\mathbb{E}(X_t\mid\mathcal{F}_s). \end{array} \]

Therefore strict local martingales are not dominated, their supremum is not integrable. However strict local martingales can be uniformly integrable, and even bounded in \( {\mathrm{L}^2} \).

A strict local martingale bounded in \( {\mathrm{L}^2} \). Let \( {{(B_t)}_{t\geq0}} \) be a standard Brownian motion in \( {\mathbb{R}^3} \) issued from \( {x\in\mathbb{R}^3} \) with \( {x\neq0} \). Then the inverse Bessel process \( {{(|B_t|^{-1})}_{t\geq0}} \) is a well defined local martingale, bounded in \( {\mathrm{L}^2} \), but is not a martingale.

Proof. Our first goal is to show that the process \( {{(|B_t|^{-1})}_{t\geq0}} \) is well defined, namely that \( {B} \) takes its values in \( {D=\{x\in\mathbb{R}^3:x\neq0\}} \). For that we consider \( {0<r<|x|} \), and we define

\[ T_r=\inf\{t\geq0:|B_t|=r\}. \]

The stopped process \( {{(X_t)}_{t\geq0}={(B_{t\wedge T_r})}_{t\geq0}} \) takes its values in the open set \( {D} \). Now, the function \( {y\in D\mapsto u(y)=|y|^{-1}} \) is harmonic, namely \( {\Delta u=0} \) since

\[ \partial_iu(y)=-\frac{y_i}{|y|^3}, \quad\text{and}\quad \partial^2_{i,i}u(y)=\frac{3y_i^2-|y|^2}{|y|^{5}}. \]

Also, by the Itô formula, for all \( {t\geq0} \), using \( {\langle X^j,X^k\rangle_t=\langle B^j,B^k\rangle_{t\wedge T_r}=(t\wedge T_r)\mathbf{1}_{j=k}} \),

\[ u(X_t)=u(X_0)+\int_0^t\nabla u(X_s)\cdot\mathrm{d}X_s+\frac{1}{2}\int_0^{t\wedge T_r}\Delta u(X_s)\mathrm{d}s. \]

The last integral vanishes because \( {\Delta u=0} \) on \( {D} \), hence

\[ {(u(X_t))}_{t\geq0}={(|B_{t\wedge T_r}|^{-1})}_{t\geq0} \]

is a local martingale, bounded by the constant \( {r^{-1}} \), hence, by the domination criterion above, a bounded martingale.

Let us compute now \( {\mathbb{P}(T_r<T_R)} \) for \( {0<r<|x|<R} \). Since a \( {1} \)-dimensional Brownian motion almost surely exits every finite interval, the first component of our \( {3} \)-dimensional Brownian motion started from \( {x} \) almost surely exits \( {[-R,R]} \), and it follows that almost surely \( {T_R<\infty} \). In particular, since \( {r<R} \), almost surely either \( {T_r<T_R} \) or \( {T_R<T_r} \), and we cannot have \( {T_r=T_R} \). First, we have the immediate equation

\[ 1=\mathbb{P}(T_r<T_R)+\mathbb{P}(T_r>T_R). \]

On the other hand, by the Doob stopping theorem for the bounded martingale \( {Y={(|B_{t\wedge T_r}|^{-1})}_{t\geq0}} \) and the finite stopping time \( {T_R} \), we get the new equation

\[ \frac{1}{|x|} =\mathbb{E}(Y_0) =\mathbb{E}(Y_{T_R}) =\mathbb{E}\left(\frac{1}{|B_{T_r\wedge T_R}|}\right) =\frac{\mathbb{P}(T_r<T_R)}{r}+\frac{\mathbb{P}(T_r>T_R)}{R}. \]

Solving this couple of equations gives

\[ \mathbb{P}(T_r<T_R)=\frac{R^{-1}-|x|^{-1}}{R^{-1}-r^{-1}}. \]
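This formula is easy to test numerically. The sketch below (a crude hypothetical Euler scheme, so the boundary-crossing bias is only roughly controlled) estimates \( {\mathbb{P}(T_r<T_R)} \) for \( {|x|=1} \), \( {r=1/2} \), \( {R=2} \), where the formula gives \( {1/3} \):

```python
import numpy as np

rng = np.random.default_rng(0)
r, R = 0.5, 2.0
x = np.array([1.0, 0.0, 0.0])              # starting point, |x| = 1
dt, n_paths, max_steps = 1e-3, 2000, 20000

pos = np.tile(x, (n_paths, 1))
hit_inner = np.zeros(n_paths, dtype=bool)
active = np.ones(n_paths, dtype=bool)

for _ in range(max_steps):
    idx = np.flatnonzero(active)
    if idx.size == 0:
        break
    pos[idx] += np.sqrt(dt) * rng.standard_normal((idx.size, 3))
    d = np.linalg.norm(pos[idx], axis=1)
    hit_inner[idx[d <= r]] = True           # inner sphere reached first
    active[idx[(d <= r) | (d >= R)]] = False

est = hit_inner.mean()
exact = (1 / R - 1 / np.linalg.norm(x)) / (1 / R - 1 / r)  # = 1/3
print(est, exact)
```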

Now, on \( {\{T_r<\infty\}} \), we have \( {T_r<T_R} \) as soon as \( {R>\sup_{s\in[0,T_r]}|B_s|} \), hence \( {\{T_r<T_R\}\underset{R\rightarrow\infty}{\nearrow}\{T_r<\infty\}} \). It follows that

\[ \mathbb{P}(T_r<T_R) \underset{R\nearrow\infty}{\nearrow} \mathbb{P}(T_r<\infty) \]

and thus, from the formula above,

\[ \mathbb{P}(T_r<\infty) =\lim_{R\rightarrow\infty}\mathbb{P}(T_r<T_R) =\frac{|x|^{-1}}{r^{-1}}=\frac{r}{|x|}. \]

Now almost surely \( {B} \) is continuous, hence \( {\{T_0<\infty\}\subset\{T_r<\infty\}} \) for all \( {r\in(0,|x|)} \), and thus

\[ \mathbb{P}(T_0<\infty) \leq\lim_{r\rightarrow0^+}\mathbb{P}(T_r<\infty) =\lim_{r\rightarrow0^+}\frac{r}{|x|}=0. \]

Therefore \( {B} \) takes its values in \( {D} \), and the process \( {{(|B_t|^{-1})}_{t\geq0}} \) is well defined. This process is also adapted. It is a local martingale, localized by the stopping times \( {T_{r_n}} \) with \( {r_n=|x|/(n+1)} \) for \( {n\geq1} \), which satisfy \( {T_{r_n}\nearrow+\infty} \) almost surely.

Let us show that \( {\lim_{t\rightarrow\infty}|B_t|=+\infty} \) almost surely. This is typical of dimension \( {d\geq3} \), and related to the transience of Brownian motion. Indeed, since \( {{(|B_t|^{-1})}_{t\geq0}} \) is a non-negative local martingale, it is a super-martingale: this is easily seen by using a localizing sequence and the Fatou lemma. It follows that almost surely it converges as \( {t\rightarrow\infty} \) to an integrable random variable, hence almost surely \( {\lim_{t\rightarrow\infty}|B_t|} \) exists in \( {[0,+\infty]} \). Since \( {|B_t|/\sqrt{t}} \) converges in law to \( {|G|} \) with \( {G\sim\mathcal{N}(0,I_3)} \), we have \( {|B_t|\rightarrow+\infty} \) in law, and the almost sure limit can only be \( {+\infty} \).

Let us show now that \( {{(|B_t|^{-1})}_{t\geq0}} \) is bounded in \( {\mathrm{L}^2} \). By rotational invariance and scaling of Brownian motion, we can assume without loss of generality that \( {x=(1,0,0)} \). Since \( {B_t\sim\mathcal{N}(x,tI_3)} \), using spherical coordinates

\[ y_1=r\cos(\theta)\sin(\varphi),\quad y_2=r\sin(\theta)\sin(\varphi),\quad y_3=r\cos(\varphi) \]

with \( {r\in[0,\infty)} \), \( {\theta\in[0,2\pi)} \), \( {\varphi\in[0,\pi]} \), we have \( {\mathrm{d}y=r^2\sin(\varphi)\mathrm{d}r\mathrm{d}\theta\mathrm{d}\varphi} \), and for all \( {t>0} \),

\[ \begin{array}{rcl} \mathbb{E}(|B_t|^{-2}) &=&(2\pi t)^{-3/2}\int_{\mathbb{R}^3} \left|y\right|^{-2}\mathrm{e}^{-\frac{y_1^2+y_2^2+(y_3-1)^2}{2t}}\mathrm{d}y\\ &=&(2\pi t)^{-3/2}\int_0^\infty\int_0^{2\pi}\int_0^\pi r^{-2}\mathrm{e}^{-\frac{r^2\sin(\varphi)^2+(r\cos(\varphi)-1)^2}{2t}} r^2\sin(\varphi)\mathrm{d}r\mathrm{d}\theta\mathrm{d}\varphi\\ &=&(2\pi)^{-1/2}t^{-3/2}\int_0^\infty\int_0^\pi \mathrm{e}^{-\frac{r^2\sin(\varphi)^2+(r\cos(\varphi)-1)^2}{2t}} \sin(\varphi)\mathrm{d}r\mathrm{d}\varphi\\ &=&(2\pi)^{-1/2}t^{-3/2}\int_0^\infty\int_0^\pi \mathrm{e}^{-\frac{r^2-2r\cos(\varphi)+1}{2t}} \sin(\varphi)\mathrm{d}r\mathrm{d}\varphi\\ &=&(2\pi)^{-1/2}t^{-3/2}\mathrm{e}^{-\frac{1}{2t}}\int_0^\infty\mathrm{e}^{-\frac{r^2}{2t}}\Bigl(\int_{-1}^1 \mathrm{e}^{\frac{ru}{t}}\mathrm{d}u\Bigr)\mathrm{d}r\\ &=&(2\pi)^{-1/2}t^{-3/2}\mathrm{e}^{-\frac{1}{2t}}\int_0^\infty\mathrm{e}^{-\frac{r^2}{2t}}\Bigl[\frac{t}{r}\mathrm{e}^{\frac{ru}{t}}\Bigr]_{u=-1}^{u=1}\mathrm{d}r\\ &=&2(2\pi)^{-1/2}t^{-3/2}\mathrm{e}^{-\frac{1}{2t}}\int_0^\infty \mathrm{e}^{-\frac{r^2}{2t}}\frac{\sinh(\frac{r}{t})}{\frac{r}{t}}\mathrm{d}r\\ &=&2(2\pi)^{-1/2}t^{-3/2}\mathrm{e}^{-\frac{1}{2t}} \sum_{n=0}^\infty\frac{1}{(2n+1)!}\int_0^\infty\left(\frac{r}{t}\right)^{2n}\mathrm{e}^{-\frac{r^2}{2t}}\mathrm{d}r\\ &=&t^{-1}\mathrm{e}^{-\frac{1}{2t}}\sum_{n=0}^\infty\frac{t^{-2n}}{(2n+1)!}(2\pi t)^{-1/2}\int_{-\infty}^\infty r^{2n}\mathrm{e}^{-\frac{r^2}{2t}}\mathrm{d}r\\ &=&t^{-1}\mathrm{e}^{-\frac{1}{2t}}\sum_{n=0}^\infty\frac{t^{-2n}}{(2n+1)!}t^n\frac{(2n)!}{2^{n}n!}\\ &=&t^{-1}\mathrm{e}^{-\frac{1}{2t}}\sum_{n=0}^\infty\frac{(2t)^{-n}}{(2n+1)n!}\\ &=&2\mathrm{e}^{-\frac{1}{2t}} \sum_{n=0}^\infty\frac{(2t)^{-(n+1)}}{(2n+1)n!}\\ &\leq&2\mathrm{e}^{-\frac{1}{2t}} \sum_{n=0}^\infty\frac{(2t)^{-(n+1)}}{(n+1)!} =2\mathrm{e}^{-\frac{1}{2t}}(\mathrm{e}^{\frac{1}{2t}}-1)\leq2. \end{array} \]
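As a sanity check on this computation, the sketch below (hypothetical code, not from the post) evaluates the final series and, independently, the integral from the line involving \( {\sinh} \) by a trapezoidal rule, for a few values of \( {t} \) and \( {|x|=1} \): the two agree and stay below \( {2} \).

```python
import math
import numpy as np

def series_value(t, terms=80):
    # last line of the computation: 2 e^{-1/(2t)} sum_n (2t)^{-(n+1)} / ((2n+1) n!)
    x = 1.0 / (2.0 * t)
    return 2.0 * math.exp(-x) * sum(
        x ** (n + 1) / ((2 * n + 1) * math.factorial(n)) for n in range(terms)
    )

def quadrature_value(t):
    # line with sinh: 2 (2 pi)^{-1/2} t^{-3/2} e^{-1/(2t)} times the integral of
    # e^{-r^2/(2t)} sinh(r/t)/(r/t) over r in (0, infinity), trapezoidal rule
    r = np.linspace(1e-9, 20.0, 400_001)
    f = np.exp(-(r ** 2) / (2.0 * t)) * np.sinh(r / t) / (r / t)
    integral = float(np.sum((f[:-1] + f[1:]) * 0.5) * (r[1] - r[0]))
    return 2.0 / math.sqrt(2.0 * math.pi) * t ** -1.5 * math.exp(-1.0 / (2.0 * t)) * integral

vals = [(t, series_value(t), quadrature_value(t)) for t in (0.5, 1.0, 2.0, 5.0)]
for t, a, b in vals:
    print(t, a, b)
```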

Let us show that \( {{(Z_t)}_{t\geq0}={(|B_t|^{-1})}_{t\geq0}} \) is not a martingale by contradiction. Assume that it is a martingale. Since it is bounded in \( {\mathrm{L}^2} \), \( {\lim_{t\rightarrow\infty}Z_t=Z_\infty} \) almost surely and in \( {\mathrm{L}^1} \), with \( {Z_\infty\geq0} \) and \( {Z_\infty\in\mathrm{L}^1} \). Moreover \( {\mathbb{E}(Z_\infty)=\mathbb{E}(Z_0)=|x|^{-1}>0} \). But almost surely \( {\lim_{t\rightarrow\infty}|B_t|=+\infty} \), hence almost surely \( {Z_\infty=0} \), thus \( {\mathbb{E}(Z_\infty)=0} \), a contradiction.

Alternatively, \( {Z} \) being bounded in \( {\mathrm{L}^2} \), it would be a uniformly integrable martingale, and the Doob stopping theorem for uniformly integrable martingales with the finite stopping time \( {T_R} \) gives \( {|x|^{-1}=\mathbb{E}(Z_0)=\mathbb{E}(Z_{T_R})=R^{-1}} \), a contradiction.

Alternatively, we could conduct explicit computations to show that \( {\mathbb{E}(Z_t)\searrow0} \) as \( {t\rightarrow\infty} \), which is thus yet another way to show that \( {{(Z_t)}_{t\geq0}} \) is not a martingale!
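Indeed, a Gaussian computation similar to the one above yields the closed form \( {\mathbb{E}(Z_t)=\mathrm{erf}(|x|/\sqrt{2t})/|x|} \), which decreases from \( {1/|x|} \) to \( {0} \). The sketch below (hypothetical code, stated under this closed form) evaluates it and confronts it with a direct Monte Carlo estimate at \( {t=1} \):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
a = 1.0                                  # a = |x|, starting distance to the origin

def mean_Z(t):
    # closed form E(1/|B_t|) = erf(|x| / sqrt(2t)) / |x| for B_0 = x in R^3
    return erf(a / sqrt(2.0 * t)) / a

ts = [0.5, 1.0, 5.0, 25.0, 100.0]
exact = [mean_Z(t) for t in ts]          # strictly decreasing: not a martingale

# Monte Carlo check at t = 1: B_1 = x + G with G ~ N(0, I_3)
x = np.array([a, 0.0, 0.0])
G = rng.standard_normal((200_000, 3))
mc = float(np.mean(1.0 / np.linalg.norm(x + G, axis=1)))
print(exact, mc)
```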

Stochastic differential equation. Actually \( {{(Z_t)}_{t\geq0}={(|B_t|^{-1})}_{t\geq0}} \) solves, for a suitable standard real Brownian motion \( {{(W_t)}_{t\geq0}} \),

\[ Z_t=\frac{1}{|x|}-\int_0^tZ_s^2\mathrm{d}W_s. \]
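This can be checked with the Itô formula: since \( {u(y)=|y|^{-1}} \) is harmonic on \( {D} \), the finite variation part vanishes, and, setting \( {W_t=\int_0^t|B_s|^{-1}B_s\cdot\mathrm{d}B_s} \), a standard real Brownian motion by the Lévy characterization since \( {\langle W\rangle_t=t} \),

\[ \mathrm{d}Z_t =\nabla u(B_t)\cdot\mathrm{d}B_t =-\frac{B_t}{|B_t|^3}\cdot\mathrm{d}B_t =-Z_t^2\mathrm{d}W_t. \]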

Itô stochastic integrals. Let us give an example of an Itô stochastic integral which is a local martingale but not a martingale. Of course we could consider the trivial example \( {\int_0^t\mathrm{d}Z_s=Z_t-Z_0} \) where \( {{(Z_t)}_{t\geq0}={(|B_t|^{-1})}_{t\geq0}} \) is the strict local martingale considered previously, but a deeper understanding is expected here!

A more interesting idea relies on the stochastic integral

\[ I_B(\varphi)=\int_0^\bullet\varphi_s\mathrm{d}B_s \]

where \( {\varphi} \) is the single step function \( {\varphi=U\mathbf{1}_{(0,1]}} \) with \( {U} \) \( {\mathcal{F}_0} \)-measurable, and where \( {{(B_t)}_{t\geq0}} \) is now a standard Brownian motion issued from the origin. A property of the Itô stochastic integral for semi-martingale integrators (here \( {B} \)) gives

\[ I_B(\varphi) =UB_{\bullet\wedge 1}-UB_0 =UB_{\bullet\wedge 1}. \]

Now if we take \( {U} \) independent of \( {B} \), then, in \( {[0,+\infty]} \),

\[ \mathbb{E}(|I_B(\varphi)_1|) =\mathbb{E}(|U|)\mathbb{E}(|B_1|). \]

Thus, if \( {U} \) is not integrable then \( {I_B(\varphi)_1} \) is not integrable, and \( {I_B(\varphi)} \) is not a martingale.
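The factorization is easy to see numerically. The sketch below (hypothetical code, taking \( {B} \) one-dimensional) samples an integrable \( {U} \) to check \( {\mathbb{E}(|UB_1|)=\mathbb{E}(|U|)\mathbb{E}(|B_1|)=2/\pi} \) for standard Gaussians; replacing \( {U} \) by, say, a standard Cauchy random variable makes \( {\mathbb{E}(|UB_1|)} \) infinite and destroys the martingale property:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
U = rng.standard_normal(n)    # F_0-measurable weight, independent of B
B1 = rng.standard_normal(n)   # B_1 ~ N(0, 1)

lhs = float(np.mean(np.abs(U * B1)))                    # E|U B_1|
rhs = float(np.mean(np.abs(U)) * np.mean(np.abs(B1)))   # E|U| E|B_1|
exact = 2.0 / np.pi    # (sqrt(2/pi))^2 since E|N(0,1)| = sqrt(2/pi)
print(lhs, rhs, exact)
```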

Last Updated on 2020-11-18


Convergence of discrete time martingales

Joseph Leo Doob (1910 – 2004) as president of the AMS (1963 – 1964)

It is tempting to think that discrete time martingales are simpler and more elementary than continuous martingales, and that most statements about continuous martingales can be reduced by approximation to statements about discrete time martingales. But the truth is that some statements about continuous martingales can be proved by genuinely continuous methods, which can be more elegant or simpler than discrete methods. The best for a probabilist is probably to be comfortable on both sides and to focus on the probabilistic intuition, contrary to pure analysts! Even if most of the physics of the phenomena is the same, there are specific aspects related to continuity and discontinuity and to their links by passage to the limit, which cannot be completely reduced to technical aspects.

This post is a discrete time counterpart of a previous post on the almost sure convergence of martingales. The argument that we used for a continuous martingale \( {{(M_t)}_{t\geq0}} \) with \( {M_0=0} \) relies on the fact that if for some threshold \( {R} \) we define \( {T=\inf\{t\geq0:|M_t|\geq R\}} \), then \( {|M_T|\leq R} \). Due to a possible jump at time \( {T} \), this is no longer valid when \( {M} \) is discontinuous. In particular, the argument is not valid for discrete time martingales.

In this post, we provide a proof of almost sure convergence of submartingales bounded in \( {\mathrm{L}^1} \), by reduction to the almost sure convergence of nonnegative supermartingales, itself reduced to the convergence of martingales bounded in \( {\mathrm{L}^2} \), which uses the Doob decomposition of adapted integrable processes as well as the Doob maximal inequality. We do not use the Doob stopping theorem (only the germ of it). What is remarkable here is that the whole approach is alternative to the classical proof from scratch with upcrossings which goes back to Joseph Leo Doob.

Submartingales bounded in \( {\mathrm{L}^1} \). If \( {{(X_n)}_{n\geq0}} \) is a submartingale bounded in \( {\mathrm{L}^1} \) then there exists \( {X_\infty\in\mathrm{L}^1} \) such that \( {\lim_{n\rightarrow\infty}X_n=X_\infty} \) almost surely.

Proof. Once the almost sure convergence is established, the fact that \( {X_\infty\in\mathrm{L}^1} \) follows from the Fatou lemma since

\[ \mathbb{E}(|X_\infty|) =\mathbb{E}(\varliminf_n|X_n|) \leq\varliminf_n\mathbb{E}(|X_n|) \leq\sup_n\mathbb{E}(|X_n|)<\infty. \]

Set \( {C=\sup_n\mathbb{E}(|X_n|)<\infty} \). To get almost sure convergence it suffices to show that

\[ X=Y-Z \]

where \( {{(Y_n)}_{n\geq0}} \) and \( {{(Z_n)}_{n\geq0}} \) are both nonnegative supermartingales and to use the theorem of convergence for nonnegative supermartingales. Since \( {(\bullet)^+=\max(\bullet,0)} \) is convex and nondecreasing, \( {X_n^+=\max(X_n,0)} \) defines a submartingale. Let

\[ X_n^+=X_0^++M_n+A_n \]

be its Doob decomposition. We know that \( {0\leq A_n\nearrow A_\infty} \) as \( {n\rightarrow\infty} \) almost surely where \( {A_\infty} \) takes its values in \( {[0,+\infty]} \). But since \( {\mathbb{E}(A_n)=\mathbb{E}(X_n^+)-\mathbb{E}(X_0^+)\leq C} \), it follows by monotone convergence that \( {\mathbb{E}(A_\infty)\leq C} \). Let us define

\[ Y_n=X_0^++M_n+\mathbb{E}(A_\infty\mid\mathcal{F}_n). \]

The process \( {{(Y_n)}_{n\geq0}} \) is a martingale, hence a supermartingale. It is nonnegative since \( {\mathbb{E}(A_\infty\mid\mathcal{F}_n)\geq\mathbb{E}(A_n\mid\mathcal{F}_n)=A_n} \), so that

\[ Y_n\geq X_0^++M_n+A_n=X_n^+\geq0. \]

Finally \( {Z_n=Y_n-X_n} \) defines a supermartingale, as the difference of a martingale and a submartingale, and it is nonnegative since \( {Z_n\geq X_n^+-X_n=X_n^-\geq0} \).

Nonnegative supermartingales. If \( {{(X_n)}_{n\geq0}} \) is a nonnegative supermartingale then there exists \( {X_\infty} \) taking values in \( {[0,+\infty]} \) such that \( {\lim_{n\rightarrow\infty}X_n=X_\infty} \) almost surely.

Proof. Since \( {\mathrm{e}^{-\bullet}} \) is nonincreasing and convex, the Jensen inequality gives that \( {Y_n=\mathrm{e}^{-X_n}} \) defines a submartingale, which moreover takes its values in \( {(0,1]} \) since \( {X_n\geq0} \). Let us write its Doob decomposition

\[ Y_n=Y_0+M_n+A_n \]

where \( {M} \) is a martingale and \( {A} \) is nonnegative and predictable, and \( {M_0=A_0=0} \). We have \( {0\leq A_n\nearrow A_\infty} \) as \( {n\rightarrow\infty} \) almost surely where \( {A_\infty} \) takes its values in \( {[0,+\infty]} \). It suffices now to show that \( {M} \) is a martingale bounded in \( {\mathrm{L}^2} \) and to use the theorem about the convergence of martingales bounded in \( {\mathrm{L}^2} \). The martingale property gives, for all \( {n,m} \), denoting \( {\Delta M_k=M_k-M_{k-1}} \),

\[ \mathbb{E}((M_{n+m}-M_n)^2) =\sum_{k=n+1}^{n+m}\mathbb{E}((\Delta M_k)^2). \]

Let us write \( {Y_n^2=Y_0^2+\sum_{k=1}^n(Y_k^2-Y_{k-1}^2)} \). Since \( {Y_k=Y_{k-1}+\Delta M_k+\Delta A_k} \), we get

\[ Y_n^2=Y_0^2+\sum_{k=1}^n\left[(\Delta M_k)^2+(\Delta A_k)^2+2Y_{k-1}\Delta M_k+2Y_{k-1}\Delta A_k+2\Delta M_k\Delta A_k\right]. \]

Now \( {Y_0^2+\sum_k(\Delta A_k)^2\geq0} \) and \( {2\sum_kY_{k-1}\Delta A_k\geq0} \) since \( {Y\geq0} \) and \( {\Delta A\geq0} \). Thus

\[ \sum_{k=1}^n(\Delta M_k)^2 +2\sum_{k=1}^n(Y_{k-1}+\Delta A_k)\Delta M_k \leq Y_n^2\leq1. \]

At this step, we note that

\[ \mathbb{E}((Y_{k-1}+\Delta A_k)\Delta M_k) =\mathbb{E}((Y_{k-1}+\Delta A_k)\mathbb{E}(\Delta M_k\mid\mathcal{F}_{k-1})) =0. \]

It follows that \( {\mathbb{E}(M_n^2)=\mathbb{E}((M_n-M_0)^2)=\sum_{k=1}^n\mathbb{E}((\Delta M_k)^2)\leq1} \).

Martingales bounded in \( {\mathrm{L}^2} \). If \( {{(M_n)}_{n\geq0}} \) is a martingale bounded in \( {\mathrm{L}^2} \), then there exists \( {M_\infty\in\mathrm{L}^2} \) such that \( {\lim_{n\rightarrow\infty}M_n=M_\infty} \) almost surely and in \( {\mathrm{L}^2} \).

Proof. For all \( {n,m} \), for all \( {1\leq k<n} \), we have, denoting \( {\Delta M_k=M_k-M_{k-1}} \),

\[ \mathbb{E}(\Delta M_k\Delta M_n) =\mathbb{E}(\mathbb{E}(\Delta M_k\Delta M_n\mid\mathcal{F}_{n-1})) =\mathbb{E}(\Delta M_k\mathbb{E}(\Delta M_n\mid\mathcal{F}_{n-1})) =0. \]

This orthogonality of successive increments gives, for all \( {n,m\geq0} \),

\[ \mathbb{E}((M_{n+m}-M_n)^2) =\sum_{k=n+1}^{n+m}\mathbb{E}((\Delta M_k)^2). \]

In particular, since \( {\sup_{n\geq0}\mathbb{E}(M_n^2)<\infty} \), we get \( {\sup_{n\geq0}\mathbb{E}((M_n-M_0)^2)<\infty} \), and thus \( {\sum_{k\geq0}\mathbb{E}((\Delta M_k)^2)<\infty} \). Moreover \( {{(M_n)}_{n\geq0}} \) is a Cauchy sequence in \( {\mathrm{L}^2} \), and thus it converges in \( {\mathrm{L}^2} \) to some \( {M_\infty\in\mathrm{L}^2} \). It remains to establish almost sure convergence. It suffices to show that \( {{(M_n)}_{n\geq0}} \) is almost surely a Cauchy sequence. Let us define

\[ X_n=\sup_{i,j\geq n}|M_i-M_j|. \]

Now it suffices to show that almost surely \( {\lim_{n\rightarrow\infty}X_n=0} \). Actually \( {0\leq X_n\searrow X_\infty} \) as \( {n\rightarrow\infty} \) almost surely where \( {X_\infty\geq0} \). Hence it suffices to show that \( {\mathbb{E}(X_\infty^2)=0} \), where the square is for computational convenience later on. By monotone convergence it suffices to show that \( {\lim_{n\rightarrow\infty}\mathbb{E}(X_n^2)=0} \). We have \( {X_n\leq 2Y_n} \) where

\[ Y_n=\sup_{k\geq n}|M_k-M_n|. \]

It suffices to show that \( {\lim_{n\rightarrow\infty}\mathbb{E}(Y_n^2)=0} \). But the Doob maximal inequality for the martingale \( {{(M_{n+k}-M_n)}_{k\geq 0}} \) gives

\[ \mathbb{E}(Y_n^2) \leq 4\sup_{k\geq n}\mathbb{E}((M_k-M_n)^2) =4\sum_{k=n+1}^\infty\mathbb{E}((\Delta M_k)^2), \]

and we already know that the right hand side is the remainder of a convergent series!

Finally note that both the limit in \( {\mathrm{L}^2} \) and the almost sure limit are the same either by using uniform integrability and using the uniqueness of the limit in \( {\mathrm{L}^2} \) or by extracting an almost sure subsequence from the \( {\mathrm{L}^2} \) convergence and using the uniqueness of the almost sure limit.

Doob maximal inequalities.

If \( {{(X_n)}_{n\geq0}} \) is a nonnegative submartingale then for all \( {n\geq0} \) and all \( {r>0} \),

\[ \mathbb{P}(\max_{0\leq k\leq n}X_k\geq r)\leq\frac{\mathbb{E}(X_n)}{r}. \]

If \( {{(M_n)}_{n\geq0}} \) is a martingale then for all \( {n\geq0} \) and all \( {p>1} \),

\[ \mathbb{E}\left(\sup_{0\leq k\leq n}|M_k|^p\right) \leq\left(\frac{p}{p-1}\right)^p\mathbb{E}(|M_n|^p) \]

in particular by monotone convergence we get

\[ \mathbb{E}\left(\sup_{n\geq0}|M_n|^p\right) \leq\left(\frac{p}{p-1}\right)^p\sup_{n\geq0}\mathbb{E}(|M_n|^p). \]

Note that \( {q=p/(p-1)} \) is the Hölder conjugate of \( {p} \). For \( {p=2} \), \( {(p/(p-1))^p=4} \).
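These inequalities can be observed on a simple random walk martingale. The sketch below (hypothetical code) checks the \( {p=2} \) bound \( {\mathbb{E}(\max_{0\leq k\leq n}|M_k|^2)\leq4\,\mathbb{E}(M_n^2)} \), with \( {\mathbb{E}(M_n^2)=n} \) here:

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps, n_paths = 100, 20_000
# M_n = sum of i.i.d. +-1 steps: a martingale with E(M_n^2) = n
steps = rng.choice([-1, 1], size=(n_paths, n_steps))
M = np.cumsum(steps, axis=1)

max_sq = float(np.mean(np.max(np.abs(M), axis=1) ** 2))  # E(max_k |M_k|^2)
end_sq = float(np.mean(M[:, -1] ** 2))                   # E(M_n^2), about n
print(max_sq, end_sq, 4 * n_steps)
```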

Proof. For the first inequality, we set \( {T=\inf\{n\geq0:X_n\geq r\}} \). For all \( {k\leq n} \), we have \( {\{T=k\}=\{X_0<r,\ldots,X_{k-1}<r,X_k\geq r\}\in\mathcal{F}_k} \). Also

\[ r\mathbf{1}_{T=k} \leq X_k\mathbf{1}_{T=k} \leq \mathbb{E}(X_n\mid\mathcal{F}_k)\mathbf{1}_{T=k} =\mathbb{E}(X_n\mathbf{1}_{T=k}\mid\mathcal{F}_k) \]

hence

\[ r\mathbb{P}(T=k)\leq\mathbb{E}(X_n\mathbf{1}_{T=k}) \]

and summing over all \( {k\leq n} \) gives

\[ r\mathbb{P}(T\leq n)\leq\mathbb{E}(X_n\mathbf{1}_{T\leq n}). \]

It remains to note that \( {\{\max_{0\leq k\leq n}X_k\geq r\}=\{T\leq n\}} \) to get the first inequality.

For the second inequality, we use the proof of the first part with the nonnegative submartingale \( {{(|M_n|)}_{n\geq0}} \). This gives, for all \( {r>0} \), denoting \( {S_n=\max_{0\leq k\leq n}|M_k|} \),

\[ r\mathbb{P}(S_n\geq r)\leq\mathbb{E}(|M_n|\mathbf{1}_{S_n\geq r}). \]

Multiplying by \( {pr^{p-2}} \) and integrating over \( {r\in(0,\infty)} \) gives

\[ \int_0^\infty r\mathbb{P}(S_n\geq r)pr^{p-2}\mathrm{d}r \leq\int_0^\infty\mathbb{E}(|M_n|\mathbf{1}_{S_n\geq r})pr^{p-2}\mathrm{d}r. \]

Now by the Fubini–Tonelli theorem, this rewrites

\[ \mathbb{E}\int_0^\infty r\mathbf{1}_{S_n\geq r}pr^{p-2}\mathrm{d}r \leq\mathbb{E}\int_0^\infty|M_n|\mathbf{1}_{S_n\geq r}pr^{p-2}\mathrm{d}r \]

namely

\[ \mathbb{E}\int_0^{S_n} pr^{p-1}\mathrm{d}r \leq\frac{p}{p-1}\mathbb{E}\int_0^{S_n}|M_n|(p-1)r^{p-2}\mathrm{d}r \]

in other words

\[ \mathbb{E}(S_n^p)\leq\frac{p}{p-1}\mathbb{E}(|M_n|S_n^{p-1}). \]

The right hand side is bounded by the Hölder inequality as

\[ \mathbb{E}(|M_n|S_n^{p-1}) \leq\mathbb{E}(|M_n|^p)^{1/p}\mathbb{E}(S_n^p)^{1-1/p}, \]

hence

\[ \mathbb{E}(S_n^p)\leq\left(\frac{p}{p-1}\right)^p\mathbb{E}(|M_n|^p). \]

Doob decomposition. If \( {{(X_n)}_{n\geq0}} \) is adapted, and integrable in the sense that \( {\mathbb{E}(|X_n|)<\infty} \) for all \( {n} \), then there exists a martingale \( {M} \) and a predictable process \( {A} \) such that

\[ M_0=A_0=0\quad\text{and}\quad X=X_0+M+A. \]

Moreover this decomposition is unique. Furthermore if \( {X} \) is a submartingale then \( {A} \) is nondecreasing and there exists \( {A_\infty} \) taking values in \( {[0,+\infty]} \) such that almost surely

\[ 0\leq A_n\underset{n\rightarrow\infty}{\nearrow} A_\infty. \]

Recall that predictable means that \( {A_n} \) is \( {\mathcal{F}_{n-1}} \)-measurable for all \( {n\geq1} \).

The process \( {A} \) is the compensator of \( {X} \), in the sense that \( {X-A} \) is a martingale. For a martingale \( {N} \), the compensator of the submartingale \( {X=N^2} \) is the increasing process \( {\langle N\rangle} \) of \( {N} \).

There is a continuous time analogue known as the Doob–Meyer decomposition.

Proof. Note that \( {A} \) is necessarily integrable too. For the uniqueness, if \( {X=X_0+M+A} \) then

\[ \mathbb{E}(X_{n+1}-X_n\mid\mathcal{F}_n)=A_{n+1}-A_n, \]

and since \( {A_0=0} \) we get, for all \( {n\geq1} \),

\[ A_n=\sum_{k=0}^{n-1}\mathbb{E}(X_{k+1}-X_k\mid\mathcal{F}_k), \]

and \( {M_n=X_n-X_0-A_n} \). For the existence, we set \( {A_0=M_0=0} \) and we use the formulas above to define \( {A_n} \) and \( {M_n} \) for all \( {n\geq1} \). Since \( {X} \) is adapted, \( {A_{n+1}} \) and \( {M_n} \) are \( {\mathcal{F}_n} \)-measurable, so that \( {A} \) is predictable and \( {M} \) is adapted. By definition \( {A_n} \) is integrable, and since \( {X_n} \) is integrable we also have that \( {M_n} \) is integrable. Moreover \( {\mathbb{E}(M_{n+1}-M_n\mid\mathcal{F}_n)=0} \) because

\[ M_{n+1}-M_n =X_{n+1}-X_n-(A_{n+1}-A_n) =X_{n+1}-X_n-\mathbb{E}(X_{n+1}-X_n\mid\mathcal{F}_n). \]

Finally, when \( {X} \) is a submartingale then for all \( {n\geq0} \) we have

\[ \begin{array}{rcl} A_{n+1}-A_n &=&\mathbb{E}(A_{n+1}-A_n\mid\mathcal{F}_n)\\ &=&\mathbb{E}(X_{n+1}-X_n\mid\mathcal{F}_n) -\mathbb{E}(M_{n+1}-M_n\mid\mathcal{F}_n)\\ &=&\mathbb{E}(X_{n+1}-X_n\mid\mathcal{F}_n) \geq0. \end{array} \]
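As an illustration, for the simple random walk \( {S_n=\xi_1+\cdots+\xi_n} \) with i.i.d. signs \( {\xi_k} \), the compensator of the submartingale \( {X_n=S_n^2} \) is \( {A_n=n} \), since \( {\mathbb{E}(S_{n+1}^2-S_n^2\mid\mathcal{F}_n)=\mathbb{E}(\xi_{n+1}^2)=1} \). A numerical sketch (hypothetical code) checking that \( {X-A} \) is centered, as a martingale started at \( {0} \) must be:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_paths = 50, 50_000
xi = rng.choice([-1, 1], size=(n_paths, n))
S = np.cumsum(xi, axis=1)      # simple random walk, a martingale

X = S ** 2                     # submartingale
A = np.arange(1, n + 1)        # compensator A_k = k (deterministic, hence predictable)
M = X - A                      # should be a martingale with M_0 = 0

dev = float(np.abs(M.mean(axis=0)).max())   # max_k |empirical E(M_k)|, near 0
print(dev)
```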

Curiosity. In the special case of nonnegative martingales bounded in \( {L\log L} \), there is an information theoretic argument due to Andrew R. Barron that resembles the one that we have used for continuous martingales in a previous post. This is written in an apparently unpublished document available online.

Thanks. This post benefited from discussions with Nicolas Fournier and Nathaël Gozlan.

Last Updated on 2020-10-10
