We consider some of the theory of Weak Convergence from the Part III course Advanced Probability. It has previously been seen, or at least discussed, that characteristic functions uniquely determine the laws of random variables. We will show Levy's continuity theorem, which equates weak convergence of random variables with pointwise convergence of their characteristic functions.

We start with the most important theorem about weak convergence, which is essentially a version of Bolzano-Weierstrass for measures on a metric space $M$. We say that a sequence of measures $(\mu_n)$ is *tight* if for any $\epsilon>0$, there exists a compact set $K_\epsilon\subseteq M$ such that $\sup_n\mu_n(M\backslash K_\epsilon)\leq \epsilon$. Informally, each measure is concentrated on a compact set, and this concentration is uniform across all the measures. We can now state and prove a result of Prohorov:
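To make the definition concrete, here is a small numerical illustration (not from the course, and with the families of measures chosen by us): for Gaussian measures $\mu_n=N(0,\sigma_n^2)$ with $\sigma_n\leq 1$, a single $K_\epsilon$ works for the whole sequence, whereas the translated family $N(n,1)$ lets mass escape to infinity and is not tight.

```python
import math

def gaussian_tail(k: float, sigma: float) -> float:
    """P(|X| > k) for X ~ N(0, sigma^2), via the error function."""
    return 1.0 - math.erf(k / (sigma * math.sqrt(2)))

# Tight family: mu_n = N(0, sigma_n^2) with sigma_n = 1 - 1/n <= 1.
# The uniform bound sigma_n <= 1 yields a single K_eps for all n.
eps = 0.01
sigmas = [1 - 1 / n for n in range(2, 100)]

# Choose K from the worst case sigma = 1, so that P(|X| > K) <= eps.
K = 1.0
while gaussian_tail(K, 1.0) > eps:
    K += 0.1

sup_tail = max(gaussian_tail(K, s) for s in sigmas)
print(f"K_eps = {K:.1f}, sup_n mu_n(R \\ [-K, K]) = {sup_tail:.4f} <= {eps}")

# Non-tight family: mu_n = N(n, 1) escapes to infinity; for any fixed K,
# the mass outside [-K, K] tends to 1, so no compact set works uniformly.
```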

**Theorem (Prohorov):** Let $(\mu_n)$ be a tight sequence of probability measures on $M$. Then there exists a subsequence $(\mu_{n_k})$ and a probability measure $\mu$ such that $\mu_{n_k}\Rightarrow\mu$.

*Summary of proof in the case $M=\mathbb{R}$:* By countability of $\mathbb{Q}$, we can use Bolzano-Weierstrass and a standard diagonal argument to find a subsequence $(n_k)$ such that the distribution functions $F_{n_k}(q)$ converge to a limit $F(q)$ at every rational $q$.

Then extend $F$ to the whole real line by taking a downward rational limit, $F(x)=\inf\{F(q):q\in\mathbb{Q},q>x\}$, which ensures that $F$ is cadlag. Convergence of the distribution functions then holds at all points of continuity of $F$, by monotonicity and approximating by rationals from above. It only remains to check that $F(x)\to 0$ as $x\to-\infty$ and $F(x)\to 1$ as $x\to\infty$, which follows from tightness. Specifically, monotonicity guarantees that $F$ has countably many points of discontinuity, so we can choose some large $N$ such that both $N$ and $-N$ are points of continuity, and exploit that eventually $F_{n_k}(N)-F_{n_k}(-N)\geq 1-\epsilon$.
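The downward rational limit can be sketched in code. In this toy example (our choice, not from the post), the rational limit values come from the point mass at 0, with a deliberately ambiguous value at the atom itself; taking the infimum over rationals strictly above $x$ restores right-continuity. Floats are dyadic rationals, so `Fraction` represents them exactly.

```python
from fractions import Fraction

# Suppose the diagonal argument has produced limit values G(q) at rationals.
# Toy example: the limit law is the point mass at 0, with the value at the
# atom q = 0 left ambiguous by the subsequential limit.
def G(q: Fraction) -> float:
    if q > 0:
        return 1.0
    if q < 0:
        return 0.0
    return 0.5  # an arbitrary limit value at the discontinuity

def F(x: float, depth: int = 40) -> float:
    """F(x) = inf over rationals q > x of G(q).  By monotonicity of G the
    infimum is the limit along q = x + 2^{-k}, a rational sequence
    decreasing to x strictly from above."""
    return min(G(Fraction(x) + Fraction(1, 2 ** k)) for k in range(depth))

# Right-continuity at the jump: F(0) takes the limit from above (= 1),
# overriding the ambiguous value G(0), while F is 0 just below the atom.
print(F(0.0), F(-1e-9), F(1e-9))
```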

We can define the limit (Borel) measure $\mu$ from the distribution function by taking the obvious definition on intervals, $\mu((a,b])=F(b)-F(a)$, then lifting to the Borel sigma-algebra by Caratheodory's extension theorem.

**Theorem (Levy):** Let $X,X_1,X_2,\ldots$ be random variables in $\mathbb{R}^d$. Then $X_n\Rightarrow X$ if and only if $\phi_{X_n}(t)\to\phi_X(t)$ for all $t\in\mathbb{R}^d$.

The direction $\Rightarrow$ is easy: $x\mapsto e^{i\langle t,x\rangle}$ is continuous and bounded, so convergence of the characteristic functions is immediate from the definition of weak convergence.
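The pointwise convergence in the theorem can be watched directly in a familiar case. For i.i.d. Rademacher steps, the characteristic function of $S_n/\sqrt{n}$ is $\cos(t/\sqrt{n})^n$, which converges pointwise to the $N(0,1)$ characteristic function $e^{-t^2/2}$, matching the central limit theorem. A minimal check (our example, not from the post):

```python
import math

# Characteristic function of S_n / sqrt(n) for i.i.d. Rademacher steps
# (+1 or -1 with probability 1/2 each): phi_n(t) = cos(t / sqrt(n))^n.
def phi_n(t: float, n: int) -> float:
    return math.cos(t / math.sqrt(n)) ** n

# Pointwise limit predicted by Levy's theorem and the CLT:
# the N(0,1) characteristic function exp(-t^2 / 2).
def phi_limit(t: float) -> float:
    return math.exp(-t ** 2 / 2)

for t in [0.0, 0.5, 1.0, 2.0]:
    print(t, phi_n(t, 10_000), phi_limit(t))
```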

In the other direction, we can in fact show a stronger constructive result. Precisely, if $\psi:\mathbb{R}^d\to\mathbb{C}$ is continuous at 0 with $\psi(0)=1$ (*), and such that $\phi_{X_n}(t)\to\psi(t)$ for all $t$, then $\psi$ is the characteristic function of some random variable $X$, and $X_n\Rightarrow X$. Note that the conditions (*) are the minimal ones such that $\psi$ could be a characteristic function.

We now proceed with the proof. We apply a lemma, essentially a calculation that we do not repeat here, giving a bound of the form

$$\mathbb{P}(|X_n|\geq K)\leq C_d K^d\int_{[-1/K,1/K]^d}\left(1-\Re\,\phi_{X_n}(t)\right)dt,$$

where $C_d$ is a dimensional constant, and we apply that the integrand is dominated by 2. From the conditions on $\psi$, continuity at 0 with $\psi(0)=1$, this is at most $\epsilon$ for large enough $K$. This bound is of course also uniform in $n$, and so the random variables are tight. Prohorov then gives a convergent subsequence, and so a limit random variable $X$ exists.
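One standard one-dimensional form of such a tightness lemma (as in Durrett, for example) is $\mathbb{P}(|X|\geq 2/u)\leq \frac{1}{u}\int_{-u}^{u}(1-\Re\,\phi_X(t))\,dt$. The following sketch checks this inequality numerically for a standard Gaussian, whose characteristic function is real; the choice of distribution and the midpoint-rule quadrature are ours.

```python
import math

def phi(t: float) -> float:
    """Characteristic function of N(0,1): exp(-t^2/2), real-valued."""
    return math.exp(-t ** 2 / 2)

def tail(k: float) -> float:
    """P(|X| >= k) for X ~ N(0,1), via the error function."""
    return 1.0 - math.erf(k / math.sqrt(2))

def lemma_bound(u: float, steps: int = 10_000) -> float:
    """(1/u) * integral_{-u}^{u} (1 - Re phi(t)) dt, midpoint rule."""
    h = 2 * u / steps
    return (1 / u) * sum((1 - phi(-u + (j + 0.5) * h)) * h for j in range(steps))

# Check P(|X| >= 2/u) <= (1/u) int_{-u}^{u} (1 - Re phi(t)) dt for a few u;
# note the integrand is dominated by 2 everywhere, as used in the proof.
for u in [0.5, 1.0, 2.0]:
    print(u, tail(2 / u), lemma_bound(u))
```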

Suppose the whole sequence does not converge to $X$. Then by Prohorov there is a separate subsequence which converges to some other limit $Y$, so by the $\Rightarrow$ direction of Levy already proved, the characteristic functions converge along this subsequence, giving $\phi_Y=\psi=\phi_X$. But characteristic functions determine the law, so $X=Y$ in distribution, which is a contradiction.
