This post provides the solution to a tiny exercise in probability theory, answering a question asked by a student during yesterday's MAP-432 class. Let (Ω,F,P) be a probability space equipped with a filtration (Fn)n≥0. Recall that a random variable τ taking values in N={0,1,…} is a stopping time when {τ=n}∈Fn for every n∈N. We define the σ-field Fτ:={A∈F:∀n,{τ=n}∩A∈Fn}. Remarkably, for any integrable random variable X and any n∈N,
E(X|Fτ)1{τ=n}=E(X|Fn)1{τ=n}.
To see it, observe that for every A∈Fn we have A∩{τ=n}∈Fτ, while {τ=n}∈Fτ, so that
E(1AX1{τ=n})=E(1A∩{τ=n}E(X|Fτ))=E(1AE(X1{τ=n}|Fτ)).
Moreover E(X1{τ=n}|Fτ)=E(X|Fτ)1{τ=n} is Fn-measurable, since {E(X|Fτ)∈B}∩{τ=n}∈Fn for every Borel set B by the very definition of Fτ. Being Fn-measurable and satisfying the display above for every A∈Fn, it is a version of E(X1{τ=n}|Fn), in other words E(X1{τ=n}|Fn)=E(X1{τ=n}|Fτ). Now {τ=n}∈Fn∩Fτ, so the indicator can be pulled out of both conditional expectations, which gives the claimed identity.
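Before moving to a consequence, here is a minimal Python sketch, not taken from the post, that checks the identity exhaustively in a toy model: N fair coin tosses, Fn generated by the first n tosses, τ the first head (capped at N), and X the total number of heads. Everything in the snippet (names, model, choice of τ and X) is my own illustration; Fτ is enumerated by brute force from its definition, so the check is not circular.

```python
# Hedged sanity check (my own toy model, not from the post): exhaustive
# verification of E(X|F_tau) 1{tau=n} = E(X|F_n) 1{tau=n} with N fair coin
# tosses, F_n generated by the first n tosses, tau the first head (capped
# at N), and X the total number of heads.
from itertools import product, chain, combinations
from fractions import Fraction

N = 3
Omega = list(product((0, 1), repeat=N))            # 2^N equally likely outcomes

def tau(w):                                        # first head, capped at N
    return next((i + 1 for i, x in enumerate(w) if x == 1), N)

def X(w):                                          # total number of heads
    return sum(w)

def is_Fn(A, n):                                   # A is F_n-measurable iff
    cl = {}                                        # membership depends only on w[:n]
    for w in Omega:
        cl.setdefault(w[:n], set()).add(w in A)
    return all(len(v) == 1 for v in cl.values())

def subsets(S):
    return chain.from_iterable(combinations(S, r) for r in range(len(S) + 1))

# F_tau := {A : A ∩ {tau=n} is F_n-measurable for every n}, listed by brute force
F_tau = [frozenset(A) for A in subsets(Omega)
         if all(is_Fn({w for w in A if tau(w) == n}, n) for n in range(N + 1))]
# The sigma-fields F_n themselves, listed by brute force
F = [[frozenset(A) for A in subsets(Omega) if is_Fn(set(A), n)] for n in range(N + 1)]

def cond_exp(w, G):                                # E(X|G)(w): average of X over
    atom = set(Omega)                              # the atom of G containing w
    for A in G:
        if w in A:
            atom &= A
    return Fraction(sum(X(v) for v in atom), len(atom))

for w in Omega:
    n = tau(w)
    assert cond_exp(w, F_tau) == cond_exp(w, F[n])  # E(X|F_tau) = E(X|F_n) on {tau=n}
print("identity verified on every outcome")
```

Any other integrable X and any other stopping time on this finite space would pass the same check, as the identity itself guarantees.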
A nice property. If τ and θ are stopping times then τ∧θ:=min(τ,θ) is a stopping time, and moreover for every integrable random variable X,
E(E(X|Fθ)|Fτ)=E(E(X|Fτ)|Fθ)=E(X|Fτ∧θ).
Note that if both τ and θ are deterministic (i.e. constant) then we recover the usual tower property of conditional expectations. Note also that the events {τ≤θ}, {τ=θ}, and {τ≥θ} all belong to Fτ∩Fθ, and that Fτ∧θ⊂Fτ∩Fθ.
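The remarks above are also easy to test mechanically. Here is a small self-contained Python sketch, again my own toy model rather than anything from the post, with τ the first head (capped at N) and θ the constant time 2 (a deterministic, hence trivial, stopping time): it verifies that {τ≤θ}, {τ=θ}, {τ≥θ} belong to Fτ∩Fθ and that every set of Fτ∧θ belongs to Fτ∩Fθ.

```python
# Hedged sanity check (my own toy model, not from the post) of the remark above:
# N fair coin tosses, F_n generated by the first n tosses, tau the first head
# (capped at N), theta the constant time 2 (a deterministic stopping time).
from itertools import product, chain, combinations

N = 3
Omega = list(product((0, 1), repeat=N))            # 2^N equally likely outcomes

def tau(w):   return next((i + 1 for i, x in enumerate(w) if x == 1), N)
def theta(w): return 2
def tmin(w):  return min(tau(w), theta(w))         # tau ∧ theta

def is_Fn(A, n):                                   # membership in A depends
    cl = {}                                        # only on the first n tosses
    for w in Omega:
        cl.setdefault(w[:n], set()).add(w in A)
    return all(len(v) == 1 for v in cl.values())

def in_F(A, T):                                    # A in F_T iff A∩{T=n} in F_n for all n
    return all(is_Fn({w for w in A if T(w) == n}, n) for n in range(N + 1))

le = {w for w in Omega if tau(w) <= theta(w)}
eq = {w for w in Omega if tau(w) == theta(w)}
ge = {w for w in Omega if tau(w) >= theta(w)}
assert all(in_F(A, tau) and in_F(A, theta) for A in (le, eq, ge))

all_subsets = chain.from_iterable(combinations(Omega, r) for r in range(len(Omega) + 1))
assert all(in_F(set(A), tau) and in_F(set(A), theta)
           for A in all_subsets if in_F(set(A), tmin))
print("remark verified in the toy model")
```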
Proof. The fact that τ∧θ is a stopping time is left to the reader. To prove the identity on conditional expectations, it suffices by symmetry to prove the second equality. Set Y:=E(X|Fτ∧θ). Since Fτ∧θ⊂Fθ, the random variable Y is Fθ-measurable, so it suffices to check that for every A∈Fθ,
E(Y1A)=E(E(X|Fτ)1A).
We start by observing that Y1{τ≤θ}=E(X|Fτ)1{τ≤θ}: indeed, on {τ≤θ}∩{τ=n} we have τ∧θ=τ=n, and the identity above shows that both sides are equal to E(X|Fn) on this event. This gives immediately
E(Y1A1{τ≤θ})=E(E(X|Fτ)1A1{τ≤θ}).
On the other hand, since τ∧θ=θ=k on {τ>k}∩{θ=k}, the identity above (applied to the stopping time τ∧θ) gives Y=E(X|Fk) on this event, and therefore
E(Y1A1{τ>θ})=∑k≥0E(E(X|Fk)1A∩{τ>k}∩{θ=k}).
Now, since Bk:=A∩{τ>k}∩{θ=k}∈Fk∩Fτ (indeed A∩{θ=k}∈Fk because A∈Fθ and {τ>k}∈Fk, while Bk∩{τ=m} equals ∅ for m≤k and belongs to Fm for m>k),
E(E(X|Fk)1Bk)=E(X1Bk)=E(E(X|Fτ)1Bk)
which gives, by summing over k,
E(Y1A1{τ>θ})=E(E(X|Fτ)1A1{τ>θ}).
Adding this to the identity obtained on {τ≤θ} gives E(Y1A)=E(E(X|Fτ)1A), as desired.
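To close, here is a hedged Python sketch, once more my own toy model and not part of the original argument, checking the full identity E(E(X|Fθ)|Fτ)=E(E(X|Fτ)|Fθ)=E(X|Fτ∧θ) pointwise by exhaustive enumeration, with τ the first head (capped at N), θ the constant time 2, and X the total number of heads.

```python
# Hedged numerical illustration (my own toy model, not from the post):
# exhaustive pointwise check of E(E(X|F_theta)|F_tau) = E(E(X|F_tau)|F_theta)
# = E(X|F_{tau∧theta}) with N fair coin tosses, tau the first head (capped
# at N), theta the constant time 2, and X the total number of heads.
from itertools import product, chain, combinations
from fractions import Fraction

N = 3
Omega = list(product((0, 1), repeat=N))            # 2^N equally likely outcomes

def tau(w):   return next((i + 1 for i, x in enumerate(w) if x == 1), N)
def theta(w): return 2
def tmin(w):  return min(tau(w), theta(w))         # tau ∧ theta
def X(w):     return sum(w)

def is_Fn(A, n):                                   # A depends only on w[:n]
    cl = {}
    for w in Omega:
        cl.setdefault(w[:n], set()).add(w in A)
    return all(len(v) == 1 for v in cl.values())

def field(T):                                      # F_T = {A : A∩{T=n} in F_n for all n}
    subs = chain.from_iterable(combinations(Omega, r) for r in range(len(Omega) + 1))
    return [frozenset(A) for A in subs
            if all(is_Fn({w for w in A if T(w) == n}, n) for n in range(N + 1))]

def cond_exp(f, G):                                # E(f|G) as a dict w -> value,
    out = {}                                       # averaging f over the atom of w
    for w in Omega:
        atom = set(Omega)
        for A in G:
            if w in A:
                atom &= A
        out[w] = Fraction(sum(f(x) for x in atom), len(atom))
    return out

F_tau, F_theta, F_min = field(tau), field(theta), field(tmin)
EX_tau, EX_theta = cond_exp(X, F_tau), cond_exp(X, F_theta)
lhs = cond_exp(lambda w: EX_theta[w], F_tau)       # E(E(X|F_theta)|F_tau)
mid = cond_exp(lambda w: EX_tau[w], F_theta)       # E(E(X|F_tau)|F_theta)
rhs = cond_exp(X, F_min)                           # E(X|F_{tau∧theta})
assert all(lhs[w] == mid[w] == rhs[w] for w in Omega)
print("commutation identity verified pointwise on all outcomes")
```

The σ-fields are listed by brute force from their definitions and the conditional expectations are computed as exact averages over atoms, so the three quantities are compared with no Monte Carlo error.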