This term, some members of the probability group have been reading *Probability and Real Trees* by Steve Evans, based on his Saint-Flour course in 2005. A pdf can be found here. This morning was my turn to present, and I gave a precis of Chapter 4, which concerns metrics on metric spaces, a family of tools which will be essential for later chapters discussing convergence of trees, viewed as metric objects.

**Hausdorff Distance**

We start by considering a metric on subsets of a given base space X. The *Hausdorff distance* between two sets A, B is defined as

$$d_H(A,B) := \inf\{r > 0 : A \subseteq U_r(B),\ B \subseteq U_r(A)\},$$

where $U_r(A) := \{x \in X : d(x,A) \le r\}$ consists of the set A together with all the points within r of A. So the Hausdorff distance measures how much we have to fatten each set before it contains the other. Note that if we have a giant set next to a tiny one, we will have to fatten the tiny one a great deal more to achieve this. Sometimes it will be more helpful to think of the following alternative characterisation:

$$d_H(A,B) = \max\left(\sup_{a \in A} \inf_{b \in B} d(a,b),\ \sup_{b \in B} \inf_{a \in A} d(a,b)\right).$$

In words, we measure how far away is the point in A farthest from B, and vice versa, and take the larger of the two. The presence of the sups and infs means that including or excluding the boundaries of A and B does not affect this distance. In particular, this means that $d_H(A, \bar{A}) = 0$, and so to allow us to call the Hausdorff distance a metric, we restrict attention to $M(X)$, the set of closed subsets of X.
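As a sanity check on the sup-inf characterisation, here is a short Python sketch (my own illustration, not from the book; the function name `hausdorff` is mine) computing the Hausdorff distance between finite subsets of the real line:

```python
def hausdorff(A, B, d):
    """Hausdorff distance between finite non-empty sets A, B:
    max(sup_a inf_b d(a,b), sup_b inf_a d(a,b))."""
    sup_A = max(min(d(a, b) for b in B) for a in A)  # farthest point of A from B
    sup_B = max(min(d(a, b) for a in A) for b in B)  # farthest point of B from A
    return max(sup_A, sup_B)

d = lambda x, y: abs(x - y)  # usual metric on the real line
A = {0.0, 1.0}
B = {0.0, 1.0, 5.0}  # B has one faraway extra point
print(hausdorff(A, B, d))  # 4.0: A must be fattened by 4 before it swallows 5
```

This also illustrates the asymmetry remark above: B already contains A, so B needs no fattening at all, yet the distance is large because A must grow to cover the point 5.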

We also observe immediately that if X does not have bounded diameter, then it is possible for the Hausdorff distance between two sets to be infinite. We won’t worry about that for now, because we will mainly be considering host spaces which are compact, for which the following results will be useful.

**Lemma 4.4** – If (X,d) is complete, then $(M(X), d_H)$ is also complete.

*Proof:* Assume we have a sequence $(S_n)$ of closed sets which is Cauchy under $d_H$. I'm going to do this slightly differently to Evans, because it's not the case that you can immediately choose, for each n, a point $x_n \in S_n$ such that $d(x_n, x_m) \le d_H(S_n, S_m)$ for all m, n. For an explicit counterexample, see the comment from Ed Crane below the article.

Note that if a subsequence of a Cauchy sequence converges to a limit, then the whole sequence converges to that same limit. So we can WLOG replace $(S_n)$ by a subsequence such that $d_H(S_n, S_{n+1}) < 2^{-n}$. Now it is clear that for any $x_n \in S_n$, there is a choice of $x_{n+1} \in S_{n+1}$ such that $d(x_n, x_{n+1}) \le 2^{-n}$ (*). Starting from arbitrary $x_m \in S_m$, we can construct in this manner a sequence $(x_n)_{n \ge m}$ that is Cauchy, and thus has a limit point $x \in X$.

Let $\mathcal{A}$ be the set of sequences $(x_n)_{n \ge m}$, for some m, with $x_n \in S_n$ for each $n \ge m$, satisfying (*). Now let S be the closure of the set of limit points of this family of sequences, which we have shown is non-empty.

Then for any n, and any $x_n \in S_n$, we can construct such a sequence, and its limit point x, and the triangle inequality down the path gives $d(x_n, x) \le \sum_{k \ge n} 2^{-k} = 2^{-(n-1)}$, so $S_n \subseteq U_{2^{-(n-1)}}(S)$. Furthermore, by construction every point of S lies within $2^{-(n-1)}$ of some point of $S_n$, so $S \subseteq U_{2^{-(n-1)}}(S_n)$, and hence it follows that $d_H(S_n, S) \le 2^{-(n-1)} \to 0$, as required.

**Lemma 4.5** – Given (X,d) compact, $(M(X), d_H)$ is also compact.

*Sketch proof:* We are working with complete metric spaces (completeness of $M(X)$ comes from Lemma 4.4), for which the existence of a finite $\epsilon$-net for every $\epsilon > 0$ is equivalent to compactness. [An $\epsilon$-net is a set S of points in the space such that every point of the space lies within $\epsilon$ of an element of S. Thinking another way, it is the set of centres of the balls in a finite covering of the space by $\epsilon$-balls.] It is not too hard to check that if $S_\epsilon$ is a finite $\epsilon$-net for (X,d), then the collection of non-empty subsets of $S_\epsilon$ is finite, and an $\epsilon$-net for $(M(X), d_H)$.
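A quick computational illustration of the final claim (my own sketch; the function name `hausdorff` and the fine sampling of a closed set are my choices, not from the text). We take $X = [0,1]$, build a finite $\epsilon$-net, and check that the nearest-point image of a closed set in the net approximates it to within $\epsilon$ in Hausdorff distance:

```python
def hausdorff(A, B, d):
    # sup-inf characterisation of Hausdorff distance for finite sets
    return max(max(min(d(a, b) for b in B) for a in A),
               max(min(d(a, b) for a in A) for b in B))

d = lambda x, y: abs(x - y)
eps = 0.25
net = [i * eps for i in range(5)]  # {0, 0.25, 0.5, 0.75, 1}: an eps-net for [0,1]

# sanity check: every point of [0,1] (on a fine grid) lies within eps of the net
assert all(min(d(x, s) for s in net) <= eps for x in [i / 1000 for i in range(1001)])

# approximate the closed set A = [0.1, 0.6] (sampled finely) by the subset of
# the net consisting of the nearest net-points of elements of A
A = [0.1 + 0.5 * i / 100 for i in range(101)]
approx = {min(net, key=lambda s: d(a, s)) for a in A}
print(hausdorff(A, sorted(approx), d) <= eps)  # True
```

Since the net has only finitely many subsets, this exhibits the finite $\epsilon$-net for $(M(X), d_H)$ claimed above.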

**Gromov-Hausdorff Distance**

So far this is fine, but won’t be useful by itself for comparing how similar two trees are as metric spaces, because we can’t be sure a priori that we can embed them in a common host space. To resolve this, we consider instead the *Gromov-Hausdorff distance*, which will serve as a distance between metric spaces, even when they are not canonically defined as subsets of a common metric space.

Given metric spaces $(X, d_X)$ and $(Y, d_Y)$, we define

$$d_{GH}(X,Y) := \inf_{Z,\, \phi,\, \psi} d_H^Z(\phi(X), \psi(Y)),$$

where the infimum is taken over all metric spaces Z and isometric embeddings $\phi: X \to Z$, $\psi: Y \to Z$.

In words, the Gromov-Hausdorff distance between two metric spaces examines the ways to embed them isometrically into a common larger metric space, and gives the minimal Hausdorff distance between them under the class of such embeddings. One issue is that the collection of all metric spaces is not a set. For example, any set can be made a metric space via the discrete metric, so the collection of metric spaces is at least as large as the collection of all sets, which is not a set. Fortunately, all is not broken: when we consider a general metric space Z in which we might embed copies of X and Y, most of the perhaps very complicated space is wasted, because we only need to compare the subsets which are isometric copies of X and Y. So in fact, we lose nothing if we assume that Z is the disjoint union of copies of X and Y, with a metric chosen appropriately. So

$$d_{GH}(X,Y) = \inf\left\{ d_H^{X \sqcup Y}(X, Y) \,:\, d \text{ a metric on } X \sqcup Y \text{ restricting to } d_X \text{ on } X \text{ and } d_Y \text{ on } Y \right\}.$$

In practice though, this is difficult to compute, since the set of things we have to minimize over is complicated. It turns out we can find an equivalent characterisation which will be easier to use in a number of examples, including the case of real trees which is the whole point of the course.

**Correspondence and Distortion**

We define a *correspondence* between X and Y to be a subset $\mathcal{R} \subseteq X \times Y$ such that

$$\pi_X(\mathcal{R}) = X, \qquad \pi_Y(\mathcal{R}) = Y,$$

where $\pi_X, \pi_Y$ are the canonical projection maps from $X \times Y$ into X and Y respectively. So we can think of a correspondence as being something like a matching between X and Y. In a matching, we insist that the projection maps into each set are injections, ie that each element of X (resp Y) can appear in at most one pair, whereas for a correspondence, we demand that the projection maps are surjections, ie that each element of X (resp Y) appears in at least one pair.

Then the *distortion* of a correspondence $\mathcal{R}$ is defined by

$$\mathrm{dis}(\mathcal{R}) := \sup\left\{ |d_X(x,x') - d_Y(y,y')| \,:\, (x,y), (x',y') \in \mathcal{R} \right\}.$$

In words, if two spaces are not isometric, then a correspondence can't describe an isometry between them, and the distortion is a measure of how far from being an isometry the correspondence is. That is, given a pair of pairs $(x,y), (x',y')$ in the correspondence, under an isometry the distance between the X-elements would equal the distance between the Y-elements, and the distortion measures the largest such discrepancy over all pairs of pairs in the correspondence.
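For finite spaces the distortion is a finite maximum, so it is easy to compute directly. A minimal Python sketch (my own; the function name `distortion` is not from the book), with X and Y both two-point subsets of the line:

```python
from itertools import product

def distortion(R, dX, dY):
    """Distortion of a correspondence R, given as a list of pairs (x, y):
    the largest |dX(x,x') - dY(y,y')| over pairs of pairs in R."""
    return max(abs(dX(x, x2) - dY(y, y2))
               for (x, y), (x2, y2) in product(R, repeat=2))

dX = lambda a, b: abs(a - b)
dY = dX
# X = {0, 1} and Y = {0, 2}: the obvious correspondence stretches one distance
R = [(0, 0), (1, 2)]
print(distortion(R, dX, dY))  # 1, since |dX(0,1) - dY(0,2)| = |1 - 2|
```

No correspondence between these two spaces does better: any correspondence must either pair some point with both points of the other space (distortion at least 2), or contain this matching or its reverse (distortion 1).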

**Theorem 4.11** – $d_{GH}(X,Y) = \frac{1}{2} \inf_{\mathcal{R}} \mathrm{dis}(\mathcal{R})$, where the infimum is taken over all correspondences $\mathcal{R}$ from X to Y.

*Remark:* The RHS of this equivalence can be thought of as seeking the coupling between X and Y for which paired points have distances as equal as possible.

*Proof:* Given an embedding of X and Y into a common space Z with $d_H(X,Y) < r$, we obtain a correspondence $\mathcal{R}$ with $\mathrm{dis}(\mathcal{R}) < 2r$, by taking:

$$\mathcal{R} := \{(x,y) \in X \times Y \,:\, d_Z(x,y) < r\}.$$

From the definition of Hausdorff distance, it follows that for every x there is a y with $d_Z(x,y) < r$, and vice versa, and hence the appropriate projection maps are surjections, so $\mathcal{R}$ is indeed a correspondence. The triangle inequality in Z then bounds its distortion by 2r, giving $\frac12 \inf_{\mathcal{R}} \mathrm{dis}(\mathcal{R}) \le d_{GH}(X,Y)$.

So it remains to prove that $d_{GH}(X,Y) \le \frac12 \mathrm{dis}(\mathcal{R})$ for any correspondence $\mathcal{R}$. We can define a metric on $X \sqcup Y$ by keeping $d_X$ and $d_Y$ on X and Y, and for $x \in X$, $y \in Y$ setting

$$d(x,y) := \inf\left\{ d_X(x,x') + \tfrac12 \mathrm{dis}(\mathcal{R}) + d_Y(y',y) \,:\, (x',y') \in \mathcal{R} \right\}.$$

Then for any $x \in X$, there is $y \in Y$ with $(x,y) \in \mathcal{R}$, and thus $d(x,y) \le \frac12 \mathrm{dis}(\mathcal{R})$, and vice versa, hence $d_H(X,Y) \le \frac12 \mathrm{dis}(\mathcal{R})$ under this metric.

It only remains to check that this is actually a metric. Let's take $x, x' \in X$ and $y \in Y$, and so

$$d(x,x') + d(x',y) = d_X(x,x') + \inf\left\{ d_X(x',x'') + \tfrac12 \mathrm{dis}(\mathcal{R}) + d_Y(y'',y) \,:\, (x'',y'') \in \mathcal{R} \right\},$$

so taking $d_X(x,x')$ inside the brackets and applying the triangle inequality in X gives $d(x,x') + d(x',y) \ge d(x,y)$, one form of the triangle inequality (the version with two points of Y is identical). We also have to check the 'other combination' of the triangle inequality: that $d(x,y) + d(y,x') \ge d_X(x,x')$. We assume that the infima for $d(x,y)$ and $d(y,x')$ are attained at $(x_1,y_1), (x_2,y_2) \in \mathcal{R}$ respectively, so

$$d(x,y) + d(y,x') = d_X(x,x_1) + d_Y(y_1,y) + d_Y(y,y_2) + d_X(x_2,x') + \mathrm{dis}(\mathcal{R}).$$

But we also have $d_X(x_1,x_2) \le d_Y(y_1,y_2) + \mathrm{dis}(\mathcal{R}) \le d_Y(y_1,y) + d_Y(y,y_2) + \mathrm{dis}(\mathcal{R})$ from the definition of distortion, and so adding these gives

$$d(x,y) + d(y,x') \ge d_X(x,x_1) + d_X(x_1,x_2) + d_X(x_2,x') \ge d_X(x,x'),$$

the triangle inequality we want, and this completes the proof of the theorem.
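Theorem 4.11 makes the Gromov-Hausdorff distance directly computable for small finite spaces: enumerate every subset of $X \times Y$ whose projections are surjective, and take half the minimal distortion. A brute-force Python sketch (my own, exponential in $|X||Y|$, so only suitable for tiny examples; all names are mine):

```python
from itertools import product

def distortion(R, dX, dY):
    # largest |dX(x,x') - dY(y,y')| over pairs of pairs in the correspondence R
    return max(abs(dX(x, x2) - dY(y, y2))
               for (x, y), (x2, y2) in product(R, repeat=2))

def gromov_hausdorff(X, Y, dX, dY):
    """d_GH via Theorem 4.11: half the minimal distortion over all
    correspondences, i.e. subsets of X x Y projecting onto both factors."""
    pairs = list(product(X, Y))
    best = float("inf")
    for mask in range(1, 2 ** len(pairs)):  # all non-empty subsets of X x Y
        R = [p for i, p in enumerate(pairs) if (mask >> i) & 1]
        if {x for x, _ in R} == set(X) and {y for _, y in R} == set(Y):
            best = min(best, distortion(R, dX, dY))
    return best / 2

d = lambda a, b: abs(a - b)
# two-point spaces with inter-point distances 1 and 2 respectively
print(gromov_hausdorff([0, 1], [0, 2], d, d))  # 0.5
```

The answer 0.5 matches the heuristic that the best correspondence must stretch the single distance 1 to the single distance 2, at distortion 1.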