
Back to basics - Irreducible Markov kernels

Markov chain

Let $(X_n)_{n\geq0}$ be a Markov chain on an at most countable state space $E$, with transition kernel $P$. For any $x,y\in E$, the mean number of passages in $y$ before returning to $x$ is given by

$$\mu_x(y)=\mathbb{E}_x\Bigl(\sum_{n=0}^{T_x-1}\mathbf{1}_{X_n=y}\Bigr)\quad\text{where}\quad T_x=\inf\{n>0:X_n=x\}.$$

Note that $\mu_x(x)=1$ and $\mu_x(E)=\mathbb{E}_x(T_x)$.
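These excursion quantities are easy to estimate by simulation. Here is a minimal Monte Carlo sketch, assuming a small hypothetical symmetric kernel on $E=\{0,1,2\}$ (an illustrative choice, not from the text), which averages the visit counts and the return time over many excursions from $x=0$:

```python
import random

# Hypothetical symmetric kernel on E = {0, 1, 2} (illustrative, not from
# the text): stay with probability 1/2, else move to either other state
# with probability 1/4 each.
P = {
    0: [(0, 0.50), (1, 0.25), (2, 0.25)],
    1: [(0, 0.25), (1, 0.50), (2, 0.25)],
    2: [(0, 0.25), (1, 0.25), (2, 0.50)],
}

def step(x):
    u, acc = random.random(), 0.0
    for y, q in P[x]:
        acc += q
        if u < acc:
            return y
    return y                      # guard against float rounding

def excursion(x0):
    """One excursion from x0: visit counts before the return time, and T_x0."""
    visits = {y: 0 for y in P}
    x, t = x0, 0
    while True:
        visits[x] += 1
        t += 1
        x = step(x)
        if x == x0:
            return visits, t

random.seed(0)
N = 100_000
tot = {y: 0 for y in P}
T = 0
for _ in range(N):
    v, t = excursion(0)
    for y in P:
        tot[y] += v[y]
    T += t
mu0 = {y: tot[y] / N for y in P}  # estimates mu_0(y)
print(mu0, T / N)                 # mu_0(0) = 1 exactly; E_0(T_0) is about 3
```

Since this kernel is doubly stochastic, its invariant probability is uniform, so the theorem below predicts $\mu_0(y)=1$ for every $y$ and $\mathbb{E}_0(T_0)=3$.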

I very much like the following classical theorem about irreducible kernels:

Irreducible kernels. Suppose that $P$ is irreducible. Then every invariant measure $\mu$ has full support and satisfies $\mu(y)\geq\mu(x)\mu_x(y)$ for any $x$ and $y$ in $E$. Moreover there are three cases:

  1. $P$ is transient. Then $\mu(E)=\infty$ for every invariant measure $\mu$, $E$ is infinite, and there is no invariant probability measure;
  2. $P$ is recurrent. Then $\mu_x$ is invariant for every $x\in E$; the invariant measures are all proportional; and if $\mu$ is invariant then $\mu(y)=\mu(x)\mu_x(y)$ for any $x,y\in E$. In particular $\mu_x(y)\mu_y(x)=1$. There are two sub-cases:
    1. $P$ is positive recurrent. Then there exists a unique invariant probability measure $\mu$, given by $\mu(x)=1/\mathbb{E}_x(T_x)$ for every $x\in E$;
    2. $P$ is null recurrent. Then $\mu(E)=\infty$ for every invariant measure $\mu$, $E$ is infinite, and there is no invariant probability measure.
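In the positive recurrent case, the formula $\mu(x)=1/\mathbb{E}_x(T_x)$ can be checked numerically: any irreducible finite kernel is positive recurrent. A sketch, assuming a hypothetical 3-state kernel, which computes the invariant probability as a left eigenvector and the expected return times by first-step analysis:

```python
import numpy as np

# Hypothetical irreducible kernel on 3 states (any irreducible finite
# kernel is positive recurrent); we check pi(x) = 1 / E_x(T_x).
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])

# Invariant probability: left eigenvector of P for the eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi = pi / pi.sum()

def expected_return_time(P, x):
    """E_x(T_x) by first-step analysis: the mean hitting times of x from
    y != x solve (I - Q) h = 1, where Q is P with row and column x removed."""
    n = P.shape[0]
    others = [y for y in range(n) if y != x]
    Q = P[np.ix_(others, others)]
    h = np.linalg.solve(np.eye(n - 1) - Q, np.ones(n - 1))
    full = np.zeros(n)
    full[others] = h
    return 1.0 + P[x] @ full

for x in range(3):
    print(pi[x], 1.0 / expected_return_time(P, x))  # the two columns agree
```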

Here are my three favorite examples with infinite countable state space:

Random walk on $\mathbb{Z}$. The transition kernel is given by $P(x,x+1)=p$ and $P(x,x-1)=1-p$ for every $x\in\mathbb{Z}$, with $p\in(0,1)$. The kernel is irreducible. A measure $\mu$ on $\mathbb{Z}$ is invariant for $P$ iff for every $x\in\mathbb{Z}$,

$$\mu(x)=\sum_{y\in\mathbb{Z}}\mu(y)P(y,x)=p\mu(x-1)+(1-p)\mu(x+1).$$

This gives that all invariant measures take the form

$$\mu(x)=a+b\Bigl(\frac{p}{1-p}\Bigr)^x,\qquad x\in\mathbb{Z},$$

where $a,b\geq0$ are parameters such that $\mu(0)=a+b>0$. We recover the counting measure if $b=0$, and when $a=0$ a sort of geometric measure, which is reversible. If $p=1/2$ then the two terms coincide and the counting measure is reversible.

The chain never admits an invariant probability measure. If $p=1/2$ then the chain is null recurrent and all invariant measures are proportional. If $p\neq1/2$ then the chain is transient and there is a two-parameter family of invariant measures.
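The balance equation is straightforward to verify for this two-parameter family; a quick numerical check (the values of $p$, $a$, $b$ are arbitrary test choices):

```python
# Pointwise check of mu(x) = p*mu(x-1) + (1-p)*mu(x+1) for
# mu(x) = a + b*(p/(1-p))**x; the values of p, a, b are arbitrary.
p, a, b = 0.3, 2.0, 1.5
r = p / (1 - p)

def mu(x):
    return a + b * r**x

for x in range(-5, 6):
    assert abs(mu(x) - (p * mu(x - 1) + (1 - p) * mu(x + 1))) < 1e-9
print("the two-parameter family solves the balance equation")
```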

Random walk on $\mathbb{N}$. The transition kernel is $P(x,x+1)=p$ for $x\in\{0,1,\dots\}$, $P(x,x-1)=1-p$ for $x\in\{1,2,\dots\}$, and $P(0,0)=1-p$. Here again $p\in(0,1)$. The kernel is irreducible. All invariant measures are geometric, reversible, and take the form

$$\mu(x)=\mu(0)\Bigl(\frac{p}{1-p}\Bigr)^x,\qquad x\geq0.$$

If $p<1/2$ then the chain is positive recurrent and there is a unique invariant probability measure, which is the geometric law on $\{0,1,\dots\}$ with parameter $p/(1-p)$. If $p=1/2$ then the chain is null recurrent and the invariant measures are all proportional to the counting measure. If $p>1/2$ then the chain is transient and all invariant measures are proportional to a geometrically growing measure which cannot be normalized into a probability measure.
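One can check by hand, or numerically as in the sketch below, that the geometric law $\nu(x)=(1-r)r^x$ with $r=p/(1-p)$ is indeed invariant when $p<1/2$ ($p=0.3$ and the range of $x$ are arbitrary test choices):

```python
# Check that nu(x) = (1-r) r^x with r = p/(1-p) is invariant when p < 1/2.
p = 0.3                           # arbitrary test value with p < 1/2
r = p / (1 - p)

def nu(x):
    return (1 - r) * r**x

def nuP(x):                       # (nu P)(x) for the walk on N
    if x == 0:
        return nu(0) * (1 - p) + nu(1) * (1 - p)   # P(0,0) and P(1,0)
    return nu(x - 1) * p + nu(x + 1) * (1 - p)

for x in range(20):
    assert abs(nuP(x) - nu(x)) < 1e-12
print("the geometric law is invariant")
```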

The continuous time analogue of this chain is known as the M/M/1 queue.

We could consider instead the reflected random walk on $\mathbb{N}$, obtained by taking $P(0,1)=1$. This deformation at the origin has a price: when $p<1/2$ the invariant distribution is no longer geometric; it is a mixture of a Dirac mass at $0$ and the geometric law on $\{1,2,\dots\}$.
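For the reflected walk the mixture can be made explicit: solving the balance equations (my own computation, not spelled out in the text) gives, up to normalization, an atom of mass $p$ at $0$ together with the geometric weights $(p/(1-p))^x$ on $\{1,2,\dots\}$, which is easy to verify numerically:

```python
# Invariant measure of the reflected walk (P(0,1) = 1), up to scale:
# an atom of mass p at 0 plus geometric weights r^x on {1,2,...}.
# These weights are a worked-out guess, verified by the balance equations.
p = 0.3                            # arbitrary test value with p < 1/2
r = p / (1 - p)

def mu(x):
    return p if x == 0 else r**x

def muP(x):                        # (mu P)(x) for the reflected kernel
    if x == 0:
        return mu(1) * (1 - p)                 # only 1 -> 0 is possible
    if x == 1:
        return mu(0) * 1.0 + mu(2) * (1 - p)   # P(0,1) = 1
    return mu(x - 1) * p + mu(x + 1) * (1 - p)

for x in range(20):
    assert abs(muP(x) - mu(x)) < 1e-12
print("invariant: atom at 0 plus geometric on {1,2,...}")
```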

Growth and collapse walk on $\mathbb{N}$. The transition kernel is given by $P(x,x+1)=p_x$ and $P(x,0)=1-p_x$ for every $x\in\mathbb{N}$, where $p_0=1$ and $p_1,p_2,\dots\in[0,1]$. The kernel is irreducible iff

$$\{x\in\mathbb{N}:p_x>0\}=\mathbb{N}\quad\text{and}\quad\mathrm{Card}\{x\in\mathbb{N}:p_x<1\}=\infty.$$

The very special growth-collapse nature of $P$ gives

$$\mathbb{P}_0(T_0>n)=\prod_{x=0}^{n-1}P(x,x+1)=p_0p_1\cdots p_{n-1}.$$

Also, if the kernel is irreducible, then it is recurrent iff

$$\lim_{x\to\infty}p_0p_1\cdots p_x=0.$$

If $\mu$ is invariant then for any $x\in\mathbb{N}$ with $x>0$,

$$\mu(x)=\sum_{y\in\mathbb{N}}\mu(y)P(y,x)=\mu(x-1)p_{x-1}=\mu(0)p_0p_1\cdots p_{x-1}.$$

It follows that if the kernel is irreducible, then it is positive recurrent iff

$$\sum_{x=0}^{\infty}p_0p_1\cdots p_x<\infty,$$

and in this case there exists a unique invariant probability measure given by

$$\mu(x)=\frac{1}{\mathbb{E}_x(T_x)},\qquad x\in\mathbb{N}.$$

Note that if $p=\prod_{x\in\mathbb{N}}p_x\in(0,1]$ and if $\mu$ is invariant, then $\mu(x)\geq p\,\mu(0)$ for every $x$, while $\mu(0)=\sum_{x\in\mathbb{N}}(1-p_x)\mu(x)=\mu(0)\sum_{x\in\mathbb{N}}(1-p_x)p_0\cdots p_{x-1}=\mu(0)(1-p)$ by telescoping, which forces $\mu(0)=0$ and hence $\mu=0$: there is no nontrivial invariant measure in this case.
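At the other extreme, when $\sum_x p_0\cdots p_x<\infty$ the positive recurrent case can be probed by simulation. A sketch with the illustrative choice $p_x=1/2$ for $x\geq1$: the unnormalized invariant masses are $1,1,\tfrac12,\tfrac14,\dots$ with total $3$, so $\mu(0)=1/3$ and the theorem gives $\mathbb{E}_0(T_0)=3$.

```python
import random

# Growth-collapse walk with the illustrative choice p_0 = 1 and p_x = 1/2
# for x >= 1: sum_x p_0...p_x < infinity, so the chain is positive
# recurrent, and the theorem gives E_0(T_0) = 1/mu(0) = 3.
random.seed(1)

def return_time():
    """Length of one excursion from 0 back to 0."""
    x, t = 0, 0
    while True:
        t += 1
        if x == 0 or random.random() < 0.5:  # step up with probability p_x
            x += 1
        else:
            return t                          # collapse to 0: excursion over

N = 100_000
est = sum(return_time() for _ in range(N)) / N
print(est)   # close to E_0(T_0) = 3
```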

Further reading. Markov chains, by James Norris.
