Struggles with probability topology: the Moment-Generating-Function topology

I have been struggling for a few months with topologies on probability distributions. It started when I realized that the common topologies (the weak topology and the total-variation topology) are quite weak: neither implies convergence of any moments. I then set out to understand this weakness better, and to find stronger topologies.

Today, I will discuss the MGF (moment generating function) topology, and see whether it’s satisfying.

The MGF topology

Let’s begin by recalling exactly what the MGF is. For a random variable X, the MGF is the function:

\displaystyle t \rightarrow E( \exp(t X) )

which, if you know your analysis, you will recognize as a (two-sided) Laplace transform, up to a sign in the argument. We’ll denote the MGF by M_X(t).
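To make the definition concrete, here is a small Monte Carlo sketch (the helper `mgf_estimate` is my own illustration, not a standard API): it estimates M_X(t) from samples and checks it against the known MGF of a standard normal, which is exp(t²/2).

```python
import numpy as np

rng = np.random.default_rng(0)

def mgf_estimate(samples, t):
    """Monte Carlo estimate of M_X(t) = E[exp(t * X)]."""
    return np.mean(np.exp(t * samples))

# For X ~ N(0, 1), the exact MGF is exp(t^2 / 2).
x = rng.standard_normal(200_000)
for t in (0.0, 0.5, 1.0):
    print(f"t={t}: estimate={mgf_estimate(x, t):.4f}, "
          f"exact={np.exp(t**2 / 2):.4f}")
```

Note that for larger t the variance of this estimator explodes, which already hints at why MGFs are best handled on a bounded interval around 0.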

The MGF is convex. Actually, every even derivative of the MGF is positive, so I like to call it “hyper-convex”. And further, even the log of the MGF (which is the Cumulant Generating Function, CGF) is convex. To sum up, the MGF is extremely regular.
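To see where this regularity comes from, differentiate under the expectation (a one-line sketch):

\displaystyle M_X^{(k)}(t) = E( X^k \exp(t X) )

so every even derivative M_X^{(2k)}(t) = E( X^{2k} \exp(t X) ) is nonnegative, since the quantity inside the expectation is. In particular M_X'' \geq 0, which gives convexity.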

The MGF topology is simple to define. A sequence of random variables X_n is said to converge to X iff there exists some r > 0 such that the MGFs converge pointwise to the MGF of X on the interval [-r, r]. In math:

\displaystyle \forall |t|\leq r, M_{X_n}(t) \rightarrow M_X(t)


MGF convergence is stronger than weak convergence (this is in fact a classical result: convergence of the MGFs on a neighborhood of 0 implies weak convergence, a theorem usually attributed to Curtiss). Does it imply some convergence for the moments? It does!

MGF convergence implies convergence of all moments. In fact, it implies convergence of every statistic that grows slower than \exp( r |X| ), where r is the radius of the convergence interval.
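A standard example of this in action (my own numeric sketch, not from the post): Binomial(n, λ/n) converges to Poisson(λ), and one can watch both the MGFs and the variances converge together.

```python
import numpy as np

lam, t = 2.0, 0.5
poisson_mgf = np.exp(lam * (np.exp(t) - 1))  # MGF of Poisson(lam) at t

for n in (10, 100, 1000):
    p = lam / n
    binom_mgf = (1 - p + p * np.exp(t)) ** n  # MGF of Binomial(n, p) at t
    var = n * p * (1 - p)                     # variance of Binomial(n, p)
    print(f"n={n}: MGF gap={abs(binom_mgf - poisson_mgf):.5f}, "
          f"variance={var:.4f} (Poisson variance: {lam})")
```

As n grows, the MGF gap shrinks and the binomial variance approaches λ, the Poisson variance, exactly as the theorem promises.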

Finally, we have a convergence that I feel justified in calling “strong”: it implies convergence of a large family of key properties of random variables. However, even in this case, some funky things can happen. Indeed, MGF convergence doesn’t imply convergence of all statistics: it only implies convergence of the statistics that grow slower than a given exponential function. It’s easy to build counter-examples where the MGFs converge but some faster-growing statistic fails to have the appropriate limit.
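To make such a counter-example concrete (this construction is mine, not from a reference): take X_n equal to n with probability exp(-1.5 r n) and to 0 otherwise. Its MGF converges to 1, the MGF of the constant 0, for every |t| ≤ r; yet the statistic E[exp(2 r X_n)] diverges, because exp(2 r x) grows faster than exp(r |x|).

```python
import math

r = 1.0  # radius of the MGF convergence interval

def mgf(n, t):
    """MGF of X_n, where X_n = n w.p. exp(-1.5*r*n) and 0 otherwise."""
    p = math.exp(-1.5 * r * n)
    return (1 - p) + p * math.exp(t * n)

def bad_stat(n):
    """E[exp(2*r*X_n)]: a statistic growing faster than exp(r*|x|)."""
    p = math.exp(-1.5 * r * n)
    return (1 - p) + p * math.exp(2 * r * n)

for n in (5, 10, 20):
    print(f"n={n}: M(r)={mgf(n, r):.6f}, E[exp(2r X)]={bad_stat(n):.1f}")
```

At t = r the surviving term in the MGF is exp(-0.5 r n), which vanishes, while the bad statistic contains exp(+0.5 r n), which blows up: the exponential cutoff in the theorem is sharp.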

Also, note an interesting fact: two probability distributions can have widely different density functions but almost identical cumulant generating functions (see this reference). McCullagh finds that very surprising, but it isn’t: the MGF is sensitive to moments, and not directly to the density, and almost identical moments can be obtained from very different densities.


The MGF topology isn’t perfect by any means, but it is considerably stronger than the weak topology in that it implies convergence of all moments. It seems like a good tool to have in your arsenal, especially since people rarely talk about it.

As always, feel free to correct any inaccuracies, errors, or spelling mistakes, and to send comments to my email! I’ll be glad to hear from you.