Random Permutations (Part 10)

December 18, 2019

John Baez

Groupoids generalize sets in an obvious way: while elements of a set can only be the same or different, objects of a groupoid can be 'the same' — that is, isomorphic — in more than one way. Groupoids also let us generalize the concept of cardinality. While the cardinality of a finite set is a natural number, the cardinality of a finite groupoid can be any nonnegative rational number!

This suggests that probabilities in combinatorics could secretly be cardinalities of groupoids. Indeed, Poisson distributions naturally show up this way. For a good self-contained introduction, see:

Now I want to explain how groupoid cardinality can be used to prove some important (and well-known) facts about random permutations.

A while back we saw that many questions about random permutations could be answered using just this one fact: the expected number of cycles of length \(k\) in a random permutation of an \(n\)-element set is \(1/k\), at least if \(1 \le k \le n\).

Last time we proved a stronger fact. To state this, we treat the number of cycles of length \(k\) in a permutation of an \(n\)-element set as a random variable, say \(C_{n,k}\). Then we have this: the \(p\)th moment of \(C_{n,k}\) matches that of a Poisson distribution with mean \(1/k\) whenever \(p k \le n\).

We actually did more: we described all the moments! We did this by working with 'factorial moments'. The \(p\)th factorial moment of a random variable \(x\) is the mean of

\[ x^{\underline{p}} = x(x-1)(x-2) \, \cdots \, (x-p+1) \]

Then we have this: the \(p\)th factorial moment of \(C_{n,k}\) matches that of the Poisson distribution with mean \(1/k\) whenever \(p k \le n\), and vanishes otherwise. Or if you prefer to leave Poisson distributions out of it:

\[ E(C_{n,k}^{\underline{p}}) = \left\{ \begin{array}{ccl} \displaystyle{\frac{1}{k^p}} & \mathrm{if} & p k \le n \\ \\ 0 & \mathrm{if} & p k \gt n \end{array} \right. \]
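If you like, you can check this formula by brute force for small \(n\). Here is a minimal sketch in Python (my own sanity check, not part of the argument) that averages the falling factorial of the number of \(k\)-cycles over all of \(S_n\):

    # Brute-force check of the factorial moments over all of S_n (my own sketch).
    from itertools import permutations
    from fractions import Fraction

    def count_k_cycles(perm, k):
        """Number of cycles of length k in a permutation, given as a tuple with i -> perm[i]."""
        seen, count = set(), 0
        for i in range(len(perm)):
            if i not in seen:
                j, length = i, 0
                while j not in seen:
                    seen.add(j)
                    j = perm[j]
                    length += 1
                if length == k:
                    count += 1
        return count

    def factorial_moment(n, k, p):
        """The pth factorial moment of C_{n,k}, computed exactly by summing over S_n."""
        total, size = Fraction(0), 0
        for perm in permutations(range(n)):
            c = count_k_cycles(perm, k)
            falling = 1
            for i in range(p):
                falling *= c - i          # c(c-1)...(c-p+1)
            total += falling
            size += 1
        return total / size

    print(factorial_moment(6, 2, 2))      # 1/4, since 2*2 <= 6
    print(factorial_moment(6, 2, 4))      # 0, since 4*2 > 6

For instance, with \(n = 6\) and \(k = 2\) it returns \(1/4\) for \(p = 2\) and \(0\) for \(p = 4\), matching the formula above.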

The proof of this was a fairly easy calculation, which however involved some miraculous cancellations and provided no compelling story about why the answer turns out to be so simple.

I now think I've found that story. It doesn't connect to Poisson distributions — not yet, anyway. But it does explain the meaning of the answer \(1/k^p\). It's the cardinality of a groupoid.

A quick sketch

The idea is quite simple when you know the underlying concepts. We'll show that \(E(C_{n,k}^{\underline{p}})\) is the cardinality of a certain groupoid. When \(n \lt k p\) this groupoid has no objects, so its cardinality is zero. Otherwise, this groupoid is equivalent to a product of a groupoid with cardinality \(1\) and \(p\) copies of the group \(\mathbb{Z}/k\), thought of as a one-object groupoid. Thus

\[ E(C_{n,k}^{\underline{p}}) = \frac{1}{k^p} \]

The nice thing is that each copy of \(\mathbb{Z}/k\) is naturally associated to a cycle of length \(k\) in our random permutation. It's no coincidence that \(\mathbb{Z}/k\) is a 'cyclic group'.

What is this groupoid, whose cardinality is \(E(C_{n,k}^{\underline{p}})\)? Its objects are \(n\)-element sets equipped with a permutation that in turn is equipped with an ordered \(p\)-tuple of disjoint cycles of length \(k\). There's an obvious concept of isomorphism between two such gadgets, so we get a groupoid of them. And then we just think about this groupoid a bit, and everything follows.

But let me slow down and start from the beginning.

Review of groupoid cardinality

To compute the cardinality of a groupoid \(X\), we go through its isomorphism classes of objects, pick one representative \(x\) of each class, count the morphisms from \(x\) to itself, take the reciprocal... and add up all these reciprocals. In a formula:

\[ |X| = \sum_{[x] \in \underline{X}} \frac{1}{|\mathrm{Aut}(x)|} \]

Here \(\underline{X}\) is the set of isomorphism classes of \(X\), and \(\mathrm{Aut}(x)\) is the automorphism group of \(x\), our chosen representative of the isomorphism class \([x]\).

To make sure the sum converges, we can play it safe and assume our groupoid is finite. We don't need to do this: for example, for the groupoid of finite sets the sum converges to \(e\). But today I'll only work with groupoids that are either finite or equivalent to one that's finite. (Equivalent groupoids clearly have the same cardinality.)
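As a tiny computational illustration (a helper of my own, not from this post): to compute a groupoid cardinality you just list one automorphism group order per isomorphism class and sum the reciprocals. For the groupoid of finite sets the classes are indexed by \(n \ge 0\), with automorphism group \(S_n\), and truncating the sum already gives a good approximation to \(e\):

    from math import factorial

    def groupoid_cardinality(aut_orders):
        """Sum of 1/|Aut(x)|, one term per isomorphism class."""
        return sum(1 / a for a in aut_orders)

    # The groupoid of finite sets: one isomorphism class per n, with automorphisms S_n.
    print(groupoid_cardinality(factorial(n) for n in range(20)))   # roughly 2.718281828...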

One way to get a groupoid is by taking a set \(X\) with a group \(G\) acting on it. Then we get a groupoid \(X/\!/G\) where an object is an element \(x \in X\) and a morphism from \(x\) to \(x'\) is a group element \(g \in G\) with \(gx = x'\). We compose these morphisms in the obvious way: by multiplication in \(G\).

We call \(X/\!/G\) the weak quotient of \(X\) by \(G\). In the ordinary quotient \(X/G\), we declare \(x\) and \(x'\) equal when there's a group element \(g\) with \(gx = x'\). In the weak quotient, we merely put in an isomorphism between \(x\) and \(x'\). Thus, the weak quotient retains more information.

Here's one of my favorite equations:

\[ |X/\!/G| = |X|/|G| \]

It says that to understand division, we can use groupoids. If you take a 5-element set and mod out by a 2-element group, you don't get a set with \(5/2\) elements. But if you take the weak quotient, you get a groupoid with cardinality \(5/2\).
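For concreteness, take \(\mathbb{Z}/2\) acting on \(\{1,2,3,4,5\}\) by swapping \(1\) and \(2\) and fixing \(3\), \(4\) and \(5\). The weak quotient has four isomorphism classes: the class of \(1\) (which is isomorphic to \(2\) in exactly one way, and has trivial automorphism group), and the classes of \(3\), \(4\) and \(5\), each with automorphism group \(\mathbb{Z}/2\). So the formula for groupoid cardinality gives

\[ 1 + \frac{1}{2} + \frac{1}{2} + \frac{1}{2} = \frac{5}{2} \]

in agreement with \(|X|/|G| = 5/2\).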

Groupoid cardinality and probability theory

If you have a group \(G\), how can you find a set that it acts on? One way is to use \(G\) itself. \(G\) acts on itself in three interesting ways: by left translations, by right translations, and by conjugation. No matter which method you choose, you'll get

\[ |G/\!/G| = 1 \]

If \(G\) acts on itself by left or right translation, every object of \(G/\!/G\) is isomorphic to every other object in a unique way. Thus, \(G/\!/G\) is equivalent to the groupoid with one object and one morphism. Groupoid cardinality is invariant under equivalence. So it's no surprise that \(|G/\!/G| = 1\) in this case.

It's more interesting when \(G\) acts on itself by conjugation. Then the weak quotient \(G/\!/G\) has many different isomorphism classes, one for each conjugacy class of \(G\). The fact that \(|G/\!/G|\) equals \(1\) then says this: if we run through the conjugacy classes of \(G\), pick a representative of each one, take the reciprocal of the cardinality of its centralizer, and sum up these reciprocals, we get \(1\).
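For example, take \(G = S_3\). Its conjugacy classes consist of the identity, the three transpositions, and the two 3-cycles, whose centralizers have orders \(6\), \(2\) and \(3\) respectively. Sure enough,

\[ \frac{1}{6} + \frac{1}{2} + \frac{1}{3} = 1 \]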

But the fun really starts when we consider a subset \(X \subseteq G\) that's invariant under conjugation. The probability that a random element of \(G\) lies in \(X\) is \(|X|/|G|\). But this equals \(|X/\!/G|\). So, we can compute probabilities using groupoid cardinality!

Here, of course, we are treating \(G\) as a probability measure space where each element has measure \(1/|G|\).

More generally, suppose \(\pi \colon X \to G\) is a map of \(G\)-sets, where \(G\) acts on itself by conjugation. Then we get a function \(f \colon G \to \mathbb{N}\) that counts the points of \(X\) that map to each element of \(G\):

\[ f(g) = |\pi^{-1}(g)| \]

We can think of \(f\) as a random variable, since it's a function on a probability measure space, namely \(G\). The expected value of \(f\) is then the groupoid cardinality of \(X/\!/G\). Here's why:

\[ \begin{array}{ccl} E(f) &=& \displaystyle{ \frac{1}{|G|} \sum_{g \in G} f(g) } \\ \\ &=& \displaystyle{ \frac{1}{|G|} \sum_{g \in G} |\pi^{-1}(g)| } \\ \\ &=& |X|/|G| \\ \\ &=& |X/\!/G| \end{array} \]

This may seem like an impractical way to compute expected values. But it really works wonders in some cases, namely when \(X/\!/G\) is equivalent to a groupoid whose cardinality we can compute by other means.

And that's what we'll see now.

The calculation

The key fact is this: since the function

\[ C_{n,k} \colon S_n \to \mathbb{N} \]

tells us the number of cycles of length \(k\) in a permutation of an \(n\)-element set,

\[ C_{n,k}^{\underline{p}} = C_{n,k} (C_{n,k}-1)(C_{n,k}-2) \, \cdots \, (C_{n,k}-p+1) \]

tells us the number of ordered \(p\)-tuples of distinct cycles of this sort. And since distinct cycles are the same as disjoint cycles, we can replace the word 'distinct' with 'disjoint' here — and we'll do that from now on.

The expected value \(E(C_{n,k}^{\underline{p}})\) is thus the number of ordered \(p\)-tuples of disjoint cycles of length \(k\) of a permutation, summed over all permutations, divided by \(n!\).

But this is the cardinality of a groupoid!

Let \(X\) be the set whose elements are of this form: a permutation \(\sigma \in S_n\) together with an ordered \(p\)-tuple \((c_1, \dots, c_p)\) of disjoint cycles of length \(k\). The group \(S_n\) acts on \(X\) in the obvious way: it acts by conjugation on itself, and the cycles go along for the ride. There's a map

\[ \pi \colon X \to S_n \]

that forgets the cycles and only keeps the permutation \(\sigma\). This is clearly a map of \(S_n\)-sets. So, we can use the results described in the last section!

We get a function \(f \colon S_n \to \mathbb{N}\) that counts the points of \(X\) that map to each element of \(S_n\):

\[ f(\sigma) = |\pi^{-1}(\sigma)| \]

And this function is just what I'm calling \(C_{n,k}^{\underline{p}}\). For each permutation \(\sigma\), it gives the number of ordered \(p\)-tuples of disjoint cycles of length \(k\).
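To make \(X\) concrete, here is how one might enumerate it for small \(n\) in Python (a sketch of my own, not part of the argument):

    from itertools import permutations
    from math import factorial

    def k_cycles(perm, k):
        """The cycles of length k of a permutation, each listed once."""
        seen, cycles = set(), []
        for i in range(len(perm)):
            if i not in seen:
                cycle, j = [], i
                while j not in seen:
                    seen.add(j)
                    cycle.append(j)
                    j = perm[j]
                if len(cycle) == k:
                    cycles.append(tuple(cycle))
        return cycles

    n, k, p = 5, 2, 2

    # Enumerate X: pairs (sigma, ordered p-tuple of disjoint k-cycles of sigma).
    X = [(sigma, tup)
         for sigma in permutations(range(n))
         for tup in permutations(k_cycles(sigma, k), p)]

    print(len(X), factorial(n), len(X) / factorial(n))   # 30 120 0.25

With \(n = 5\), \(k = 2\), \(p = 2\) this finds \(|X| = 30\), and \(30/120 = 1/4 = 1/k^p\), as expected.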

It follows that the expected value we're trying to compute is a groupoid cardinality:

\[ E(C_{n,k}^{\underline{p}}) = |X/\!/S_n| \]

But this groupoid \(X/\!/S_n\) is equivalent to one I find more charismatic: the groupoid \(\mathrm{Cyc}(n,k,p)\) of \(n\)-element sets equipped with a permutation that is itself equipped with a chosen \(p\)-tuple of disjoint cycles of length \(k\). The morphisms in this groupoid are the obvious ones. (At this point I'm speeding up, assuming you can think like a category theorist.)

In other words, instead of fixing our 'favorite' \(n\)-element set, which is sort of ridiculous, we use all of them.

Now, it's easy to see that an \(n\)-element set equipped with this structure is the same as a \(p\)-tuple of cyclically ordered \(k\)-element sets together with the 'remainder': a set with \(n - p k\) elements equipped with an arbitrary permutation. Thus, we have

\[ \mathrm{Cyc}(n,k,p) \simeq \mathrm{Cyc}(k)^p \times \mathrm{Perm}(n-k p) \]

Here \(\mathrm{Cyc}(k)\) is the groupoid of cyclically ordered \(k\)-element sets, and \(\mathrm{Perm}(n - k p)\) is the groupoid of sets with \(n - k p\) elements equipped with a permutation.

But \(\mathrm{Cyc}(k)\) is equivalent to the groupoid \(B(\mathbb{Z}/k)\), which has one object and the cyclic group \(\mathbb{Z}/k\) as automorphisms. That's because all objects of \(\mathrm{Cyc}(k)\) are isomorphic, and the symmetry group of a cyclically ordered \(k\)-element set is \(\mathbb{Z}/k\). An easy calculation shows that

\[ |B(\mathbb{Z}/k)| = \frac{1}{k} \]
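Alternatively, we can get this from the weak quotient formula: there are \((k-1)!\) cyclic orderings of a fixed \(k\)-element set, the group \(S_k\) acts on them, and \(\mathrm{Cyc}(k)\) is equivalent to the resulting weak quotient. So

\[ |\mathrm{Cyc}(k)| = \frac{(k-1)!}{k!} = \frac{1}{k} \]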

On the other hand,

\[ \mathrm{Perm}(n-k p) \simeq S_{n-k p} /\!/S_{n - k p} \]

where the symmetric group is acting on itself by conjugation. So, we get

\[ |\mathrm{Perm}(n-k p)| = 1 \]
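We can also check that \(|\mathrm{Perm}(m)| = 1\) for any \(m\) straight from the definition of groupoid cardinality: the isomorphism classes of \(\mathrm{Perm}(m)\) are the cycle types, i.e. partitions of \(m\), and the automorphism group of a permutation whose cycle type has \(m_j\) parts of size \(j\) is its centralizer, of order \(\prod_j j^{m_j}\, m_j!\). Here is a small Python sketch of this check (my own, with hypothetical helper names):

    from math import factorial
    from fractions import Fraction

    def partitions(m, largest=None):
        """Yield the partitions of m as weakly decreasing lists of parts."""
        if largest is None:
            largest = m
        if m == 0:
            yield []
            return
        for part in range(min(m, largest), 0, -1):
            for rest in partitions(m - part, part):
                yield [part] + rest

    def centralizer_order(cycle_type):
        """Order of the centralizer in S_m of a permutation with this cycle type."""
        result = 1
        for j in set(cycle_type):
            mj = cycle_type.count(j)
            result *= j**mj * factorial(mj)
        return result

    m = 7
    print(sum(Fraction(1, centralizer_order(lam)) for lam in partitions(m)))   # prints 1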

We thus get

\[ \begin{array}{ccl} E(C_{n,k}^{\underline{p}}) &=& |\mathrm{Cyc}(n,k,p)| \\ \\ &=& |\mathrm{Cyc}(k)^p \times \mathrm{Perm}(n-k p)| \\ \\ &=& |\mathrm{Cyc}(k)|^p \cdot | \mathrm{Perm}(n-k p)| \\ \\ &=& (1/k)^p \cdot 1 \\ \\ &=& \displaystyle{ \frac{1}{k^p} } \end{array} \]

Voilà!

Afterthoughts

Experts in category theory will note that I'm vacillating between using various large groupoids, like \(\mathrm{Perm}(n-k p)\), and various finite groupoids that are equivalent to these, like \(S_{n-k p} /\!/S_{n - k p}\). In my own mind, I barely notice the distinction — I flit back and forth depending on what's more convenient at the moment. But when I try to explain what I'm doing, I feel some need to talk about this, and then everything becomes more lengthy. It's possible I could have given a more efficient treatment, but I wanted to show you a way of thinking.

I also introduced a bit more notation than strictly necessary. When you think of a group \(G\) as a one-object groupoid, people often call that groupoid \(B G\). The reason is that if we turn this groupoid into a space, it becomes a space known as the classifying space of \(G\), also denoted \(B G\). But the groupoid \(B G\) is also a weak quotient:

\[ B G \cong 1/\!/G \]

where \(G\) is acting on the \(1\)-element set in the only possible way. So, it should be no surprise that

\[ |B G| = \frac{1}{|G|} \]

Also, we have

\[ \mathrm{Cyc}(k) \cong \mathrm{Cyc}(k,k,1) \]

In other words, a \(k\)-element cyclically ordered set is the same as a \(k\)-element set equipped with a permutation having one chosen cycle of length \(k\).

Also, we have

\[ \mathrm{Perm}(m) \cong \mathrm{Cyc}(m,k,0) \]

In other words, an \(m\)-element set equipped with a permutation is the same as an \(m\)-element set equipped with a permutation having no chosen cycles of length \(k\). (It can have lots of cycles of length \(k\); we're just not choosing any of them.)

So, I didn't need to talk about \(B(\mathbb{Z}/k)\) or \(\mathrm{Cyc}(k)\) or \(\mathrm{Perm}(n - k p)\). But I thought that reducing notation to the absolute minimum would make things harder to understand, not easier.


You can also read comments on the n-Category Café, and make your own comments or ask questions there!


© 2019 John Baez