This post concludes my non-exhaustive list of things I think are interesting about the top-to-random shuffle. In previous posts I have talked about the construction and correct sense of convergence to randomness, and that this algorithm does genuinely achieve uniform randomness at some hitting time which is easy to specify. Promising that posts will be short hasn’t worked in the past so I won’t do that again now, but the idea of this post is brief:
When we specified the dynamics of the top-to-random shuffle, we insisted that the top card could be placed anywhere in the deck with equal probability, including back on top. This appears to be doing nothing except slowing down the shuffling process. Why is this important for convergence to randomness?
Fortunately the answer is short: if we do not let the top card be inserted back onto the top, that is, if we forbid the move that leaves the configuration unchanged, then we can divide the set of orderings into two classes, and the pack will alternate between them.
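To make the dynamics concrete, here is a minimal Python sketch of a single move; the function name and the allow_stay flag are my own, and the restricted variant (allow_stay=False) is the one that causes trouble.

```python
import random

def top_to_random_step(deck, allow_stay=True):
    """One move: take the top card and reinsert it uniformly at random.

    If allow_stay is False, the top position is excluded, so the
    configuration always changes (the restricted variant discussed above).
    """
    top, rest = deck[0], deck[1:]
    # In the new deck, position 0 means the card goes straight back on top
    # and position len(deck) - 1 means it goes to the bottom.
    lowest = 0 if allow_stay else 1
    position = random.randint(lowest, len(deck) - 1)
    return rest[:position] + [top] + rest[position:]

deck = list(range(1, 6))          # the identity ordering 1,2,...,5
print(top_to_random_step(deck))   # e.g. [2, 3, 1, 4, 5]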
Why is this a problem? Suppose the classes are called X and Y, and X is the class that contains the original ordering 1,2,…,n. Then after k shuffles, the ordering of the deck will be in X if k is even and in Y if k is odd. Remember that our definition of ‘close to randomness’ is the greatest difference in probability of an event between the actual distribution and the uniform distribution. As before, you can think of this by a betting analogy – what proportion profit can you make against someone who thinks it’s uniform by knowing the true distribution?
Well, it will turn out that the sets X and Y have the same size, so in the uniform distribution, the probability that an ordering is in X is 1/2. Whereas if the pack alternates, then so long as we know how many shuffles have occurred, this probability is either 0 or 1. In particular this is far from 1/2. We should remark that if we introduce the notion of sampling at a random time, or taking an average over all large times in some sense, such problems may disappear, but the result obtained may be less useful. See this post on Cesaro Mixing for details presented in a more rigorous style.
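To spell the bound out: writing P_k for the distribution of the restricted shuffle after k steps, U for the uniform distribution, and d for the distance just described (the notation P_k and U is mine, not fixed in the earlier posts), the single event that the ordering lies in X already forces

```latex
d(P_k, U) \;\ge\; \bigl| P_k(X) - U(X) \bigr| \;=\; \bigl| \mathbf{1}\{k \text{ even}\} - \tfrac{1}{2} \bigr| \;=\; \tfrac{1}{2}.
```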
So it remains to see why this is true. First a definition. A transposition is when two elements in a permutation are exchanged. Eg 31452 -> 35412 by transposing 1 and 5. It makes sense intuitively that we can get from any permutation to any other permutation by making successive transpositions. Indeed, this is precisely what is happening in the top-to-random shuffle. To avoid continually having to write it out, we call the original permutation 1,2,…,n the identity permutation.
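For concreteness, a tiny Python helper (the name transpose_values is my own) that swaps two values in an ordering reproduces the example above:

```python
def transpose_values(perm, a, b):
    """Return a copy of perm with the values a and b exchanged."""
    swap = {a: b, b: a}
    return [swap.get(x, x) for x in perm]

print(transpose_values([3, 1, 4, 5, 2], 1, 5))   # [3, 5, 4, 1, 2], i.e. 35412
```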
Then the idea is that X is the set of permutations we can obtain by starting with the identity and applying an even number of transpositions, while Y is the set obtained by applying an odd number of transpositions. For this to work, we will need to show that these sets are disjoint. That is, no permutation can be generated by both an odd number and an even number of transpositions. This is important, as a permutation can certainly be generated from transpositions in multiple ways. For example, if the elements are 1,2,3, we can obtain the permutation 2,1,3 by transposing 1 and 2, obviously. However, we could alternatively start by transposing 2 and 3 to get 1,3,2, then 1 and 3 to get 3,1,2, then 2 and 3 again to get 2,1,3. Note that both of these require an odd number of transpositions.
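As a sanity check, the sketch below (reusing the hypothetical transpose_values helper, redefined so the snippet stands alone) confirms that both routes land on the same permutation 2,1,3, using one and three transpositions respectively.

```python
def transpose_values(perm, a, b):
    """Return a copy of perm with the values a and b exchanged."""
    swap = {a: b, b: a}
    return [swap.get(x, x) for x in perm]

identity = [1, 2, 3]

# Route 1: a single transposition of the values 1 and 2.
route_1 = transpose_values(identity, 1, 2)

# Route 2: transpose 2 and 3, then 1 and 3, then 2 and 3 again.
route_2 = identity
for a, b in [(2, 3), (1, 3), (2, 3)]:
    route_2 = transpose_values(route_2, a, b)

print(route_1, route_2)   # both [2, 1, 3]: one and three transpositions respectively
```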
We will call a permutation even if it is generated by an even number of transpositions, and odd otherwise. We also say that its sign (alternatively signature, parity) is +1 or -1 respectively. To prove this is well-defined, we really want to find a different property that is easier to track.
A useful trick is to count how many pairs of elements are not in the correct order. Let’s do this for our previous example: 31452. There are 5 elements so 5 x 4 / 2 = 10 pairs of elements. We list them:
- 1 and 2 are in the correct order.
- 1 and 3 are not, as 3 comes before 1 in this permutation.
- 1 and 4 are correct.
- 1 and 5 are correct.
- 2 and 3 are not.
- 2 and 4 are not.
- 2 and 5 are not.
- 3 and 4 are correct.
- 3 and 5 are correct.
- 4 and 5 are correct.
So 4 pairs are not in the correct order. Since 4 is even, the claim is that this means 31452 is an even permutation. To check this, and to confirm that the sign is well-defined, it suffices to check that the number of so-called inversions, or pairs in the wrong order, changes parity every time we apply a transposition; since the identity has no inversions at all, the parity of the number of transpositions used is then forced to match the parity of the inversion count.
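The pair-by-pair check above is easy to mechanise; here is a short sketch (count_inversions is my own name for it) reproducing the count for 31452:

```python
from itertools import combinations

def count_inversions(perm):
    """Count pairs of values that appear in the wrong order in perm."""
    position = {value: index for index, value in enumerate(perm)}
    return sum(1 for a, b in combinations(sorted(perm), 2)
               if position[a] > position[b])

print(count_inversions([3, 1, 4, 5, 2]))   # 4 of the 10 pairs are out of order
```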
This is clearly true if we transpose adjacent elements. Then the orderings of all pairs remain the same, apart from the pair we transposed, which changes. Then, if the elements are not adjacent, instead of transposing them directly, we can perform a succession of transpositions of adjacent elements. The easiest way to describe this is again by example. Suppose we want to transpose 3 and 5 in 31452.
31452 -> 13452 -> 14352 -> 14532 -> 15432 -> 51432.
Note that the middle transposition is actually transposing 3 and 5, and the others are symmetric about this middle operation. In particular, there is an odd number of transpositions in total. So we have proved the result for general transpositions, and thus we now know that the sign of a permutation is well-defined. Note also that there are an equal number of odd and even permutations for every n ≥ 2: transposing 1 and 2 turns every odd permutation into an even one and vice versa, and this map is its own inverse, so it gives a bijection between the two classes.
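Both facts are easy to verify by brute force for small n. The sketch below (again with my own naming) walks through the five adjacent swaps listed above, checking that the inversion count changes parity at every step, and then confirms that the 24 permutations of 1,2,3,4 split into 12 even and 12 odd.

```python
from itertools import combinations, permutations

def count_inversions(perm):
    """Count pairs of values that appear in the wrong order in perm."""
    position = {value: index for index, value in enumerate(perm)}
    return sum(1 for a, b in combinations(sorted(perm), 2)
               if position[a] > position[b])

# The chain of adjacent swaps from the example above: swap the entries in
# positions (1,2), (2,3), (3,4), (2,3), (1,2), written with 1-based positions.
perm = [3, 1, 4, 5, 2]
for i in [1, 2, 3, 2, 1]:
    perm[i - 1], perm[i] = perm[i], perm[i - 1]
    print(perm, count_inversions(perm))   # the inversion count flips parity each time

# The chain ends at [5, 1, 4, 3, 2], i.e. 31452 with 3 and 5 transposed,
# after an odd number (five) of adjacent swaps, so the overall parity flips.

# Equal numbers of even and odd permutations, checked here for n = 4.
parities = [count_inversions(p) % 2 for p in permutations(range(1, 5))]
print(parities.count(0), parities.count(1))   # 12 12
```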
What’s really going on is that we are able to multiply permutations, by doing one after the other. Unlike multiplying real numbers, the order in which we do this now matters. In this context, the set of permutations is an example of a general structure called a group. The idea of partitioning a group into subsets which are in some sense symmetric, and where some other operation jumps between the subsets, is a useful motivating idea for a whole avenue of interesting theory. Not to be explored now unfortunately…
Related articles
- Solving Rubik’s cube from scratch, in python (fulmicoton.com)