Random Permutations (Part 8)

December 8, 2019

John Baez

Last time we saw a strong connection between random permutations and Poisson distributions. Thinking about this pulled me into questions about partitions and Poisson distributions. These may be a bit of a distraction — but maybe not, since every permutation gives a partition, namely the partition into cycles.

Here's a good puzzle to start with:

Puzzle 11. Suppose raindrops are falling on your head, randomly and independently, at an average rate of one per minute. What's the average of the cube of the number of raindrops that fall on your head in one minute?

To get started we need a bit of probability theory. Suppose random events are happening at an average rate of one per minute, in a nice independent way so that the history so far doesn't affect the rate of new events now. Let \(P(k,t)\) be the probability that \(k\) events have happened between time \(0\) and time \(t\). Then:

\[ \frac{d}{d t} P(k,t) = P(k-1,t) - P(k,t) \]

since if \(k-1\) events have happened, an extra event will bring the total to \(k\), while if \(k\) events have already happened, an extra one will make the total stop being \(k\). (When \(k = 0\), we interpret \(P(k-1,t)\) as zero.)

Now, it takes a bit of work to figure out \(P(k,t)\) given this equation and the obvious initial conditions

\[P(k,0) = \left\{\begin{array}{ccl} 1 &\mathrm{if}& k = 0 \\ 0 &\mathrm{if}& k \gt 0 \end{array}\right. \]

But if someone tells you the answer, you just need elementary calculus to check that it's right:

\[ P(k,t) = \frac{t^k}{k!} e^{-t} \]

So, let's be happy someone has already figured out the answer!
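If you'd like to double-check this without doing the calculus, here's a quick numerical sanity check in Python (a sketch; the function name P is mine): it confirms that the formula satisfies the differential equation, approximating the time derivative by a finite difference, and that the probabilities sum to \(1\).

```python
from math import exp, factorial

def P(k, t):
    """Probability of k events by time t, at unit rate: t^k e^(-t) / k!"""
    return t**k * exp(-t) / factorial(k)

# Check dP/dt = P(k-1, t) - P(k, t) with a central finite difference,
# taking P(-1, t) = 0.
t, h = 1.7, 1e-6
for k in range(6):
    lhs = (P(k, t + h) - P(k, t - h)) / (2 * h)
    rhs = (P(k - 1, t) if k > 0 else 0.0) - P(k, t)
    print(k, lhs, rhs)   # the two columns agree to high accuracy

# The probabilities sum to 1 once we truncate the (tiny) tail:
print(sum(P(k, t) for k in range(60)))  # ≈ 1.0
```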

Back to our raindrop problem. Here the time \(t\) is 1 minute, so the probability of \(k\) raindrops landing on your head in this time is

\[ \frac{1}{k!} e^{-1} \]

Sanity check: these probabilities sum to 1.

We're trying to figure out the expected value of the cube of the number of raindrops that land on your head in a minute. We now know this is

\[ \frac{1}{e} \sum_{k=0}^\infty \frac{k^3}{k!} \]

So this is the sum we need to do. But being mathematicians, let's generalize. Let's figure out the expected value of the \(n\)th power of the number of raindrops that land on your head in a minute:

\[ \frac{1}{e} \sum_{k=0}^\infty \frac{k^n}{k!} \]

The sum here looks like a mutant version of

\[ \sum_{k=0}^\infty \frac{n^k}{k!} = e^n \]

But I mention this only so you don't get confused and make a wrong turn!
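Before doing the sum exactly, we can estimate the answer by simulation. Here's a minimal Monte Carlo sketch in Python, built on the standard fact that a Poisson process at unit rate has exponential inter-arrival times (the function name is made up):

```python
import random

def raindrops_in_one_minute(rate=1.0):
    # Count events in one unit of time by accumulating
    # exponential inter-arrival times until they exceed 1.
    k, t = 0, random.expovariate(rate)
    while t < 1.0:
        k += 1
        t += random.expovariate(rate)
    return k

n, samples = 3, 10**6
moment = sum(raindrops_in_one_minute()**n for _ in range(samples)) / samples
print(moment)  # hovers around 5, the answer we'll derive below
```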

To compute

\[ \frac{1}{e} \sum_{k=0}^\infty \frac{k^n}{k!} \]

it's nice to express the powers \(k^n\) in terms of 'falling powers':

\[ k^{\underline{n}} = k (k-1) (k-2) \cdots (k - n + 1) \]

The power \(k^n\) counts the number of functions from \(n\) to \(k\), where now I'm identifying a natural number with a set having that number of elements. Similarly, the falling power \(k^{\underline{n}}\) counts the number of injections from \(n\) to \(k\). Why? With an injection, the first element of \(n\) has \(k\) choices of where to go, but the next one has only \(k-1\), and so on.
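Here's a brute-force confirmation in Python (a sketch; the helper names are mine). An injection from an \(n\)-element set into a \(k\)-element set is the same as an ordered \(n\)-tuple of distinct elements of the \(k\)-element set, so we can enumerate injections with itertools.permutations:

```python
from itertools import permutations
from math import prod

def falling_power(k, n):
    """k(k-1)...(k-n+1); the empty product (n = 0) is 1."""
    return prod(k - j for j in range(n))

def count_injections(n, k):
    """Count injections n -> k: each is an ordered n-tuple
    of distinct elements of a k-element set."""
    return sum(1 for _ in permutations(range(k), n))

for n in range(4):
    for k in range(5):
        assert count_injections(n, k) == falling_power(k, n)
print("falling powers count injections, as claimed")
```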

There's a nice formula expressing ordinary powers as linear combinations of falling powers:

\[ k^n = \sum_{i = 0}^k \left\{ \begin{array}{c} n \\ i \end{array} \right\} \; k^{\underline{i}} \]

Here \(\left\{ \begin{array}{c} n \\ i \end{array} \right\}\) is just the number of partitions of \(n\) into \(i\) parts.

This formula has a wonderfully slick proof. Any function \(f \colon n \to k\) factors as a surjection followed by an injection:

\[ n \stackrel{f}{\to} \mathrm{im} f \hookrightarrow k \]

and the surjection gives a partition of \(n\) into \(i\) parts, where \(i\) is the number of points in the image \(\mathrm{im} f\). Conversely a partition of \(n\) into \(i\) parts together with an injection \(i \hookrightarrow k\) determines a function \(f \colon n \to k\). So, the number of functions \(f \colon n \to k\) is the sum over \(0 \le i \le n\) of the number of partitions of \(n\) into \(i\) parts times the number of injections \(i \hookrightarrow k\). And that's all the formula says! (The upper limit of the sum is harmless: whether we stop at \(n\) or at \(k\), the extra terms vanish, since there are no partitions of \(n\) into more than \(n\) parts and no injections \(i \hookrightarrow k\) when \(i \gt k\).)
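The identity is also easy to verify by computer for small values. A Python sketch, computing the Stirling numbers by their standard recurrence \(\left\{ \begin{array}{c} n \\ i \end{array} \right\} = i \left\{ \begin{array}{c} n-1 \\ i \end{array} \right\} + \left\{ \begin{array}{c} n-1 \\ i-1 \end{array} \right\}\) (a known fact, not proved here):

```python
from math import prod

def stirling2(n, i):
    """Number of partitions of an n-element set into i parts,
    via the recurrence S(n, i) = i*S(n-1, i) + S(n-1, i-1)."""
    if n == 0:
        return 1 if i == 0 else 0
    if i == 0:
        return 0
    return i * stirling2(n - 1, i) + stirling2(n - 1, i - 1)

def falling_power(k, i):
    return prod(k - j for j in range(i))

for n in range(6):
    for k in range(6):
        assert k**n == sum(stirling2(n, i) * falling_power(k, i)
                           for i in range(k + 1))
print("k^n equals the sum of Stirling numbers times falling powers")
```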

So, the expected value of the \(n\)th power of the number of raindrops that fall on your head is

\[ \frac{1}{e} \sum_{k=0}^\infty \frac{k^n}{k!} \; = \; \frac{1}{e} \sum_{k=0}^\infty \sum_{i = 0}^k \left\{ \begin{array}{c} n \\ i \end{array} \right\} \, \frac{k^{\underline{i}}}{k!} \]

But in fact we can simplify this! Since there are no injections \(i \hookrightarrow k\) when \(i \gt k\), the falling power vanishes in this case, so our sum is just

\[ \frac{1}{e} \sum_{i,k=0}^\infty \left\{ \begin{array}{c} n \\ i \end{array} \right\} \, \frac{k^{\underline{i}}}{k!} \]

And now we can do the sum over \(k\) first. Since

\[ \sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} \; = \; \sum_{k = 0}^\infty \frac{k(k-1)\cdots (k-i+1)}{k(k-1) \cdots 1} \; = \; \sum_{k = i}^\infty \frac{1}{(k-i)!} \; = \; \sum_{j = 0}^\infty \frac{1}{j!} \; = \; e \]

we get

\[ \frac{1}{e} \sum_{i,k=0}^\infty \left\{ \begin{array}{c} n \\ i \end{array} \right\} \, \frac{k^{\underline{i}}}{k!} \; = \; \sum_{i = 0}^\infty \left\{ \begin{array}{c} n \\ i \end{array} \right\} \]

which is the total number of partitions of an \(n\)-element set!
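Here's a numerical check of this conclusion in Python (a sketch, reusing the Stirling recurrence from the earlier sketch; the truncated sum over \(k\) converges very fast):

```python
from math import e, factorial

def stirling2(n, i):
    if n == 0:
        return 1 if i == 0 else 0
    if i == 0:
        return 0
    return i * stirling2(n - 1, i) + stirling2(n - 1, i - 1)

def partitions_count(n):
    """Total number of partitions of an n-element set."""
    return sum(stirling2(n, i) for i in range(n + 1))

def moment(n, terms=60):
    """Truncation of (1/e) * sum_k k^n / k!."""
    return sum(k**n / factorial(k) for k in range(terms)) / e

for n in range(8):
    print(n, partitions_count(n), moment(n))  # the two columns agree
```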

In our original question we were interested in \(n = 3\). There are 5 partitions of a 3-element set:

\[ \{\{1,2,3\}\}, \] \[ \{\{1,2\}, \{3\}\}, \; \{\{2,3\}, \{1\}\}, \; \{\{3,1\}, \{2\}\}, \] \[ \{\{1\}, \{2\}, \{3\}\} \]

So, the expected value of the cube of the number of raindrops that land on your head in a minute is 5.
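If you'd like to see those five partitions generated rather than listed, here's a short recursive enumeration in Python (a sketch: it inserts the first element into each block of each partition of the remaining elements, or into a new block of its own):

```python
def partitions(s):
    """Yield all partitions of the list s as lists of blocks."""
    if not s:
        yield []
        return
    first, rest = s[0], s[1:]
    for p in partitions(rest):
        # put `first` into each existing block in turn...
        for i in range(len(p)):
            yield p[:i] + [[first] + p[i]] + p[i + 1:]
        # ...or give it a block of its own
        yield [[first]] + p

ps = list(partitions([1, 2, 3]))
for p in ps:
    print(p)
print(len(ps))  # 5
```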

Now that we're done let's reflect on what happened, and introduce more jargon so you can read more about the concepts involved.

The function

\[ P(k,t) = \frac{t^k}{k!} e^{-t} \]

is called a Poisson distribution. It's the answer whenever you want to know the probability of a given number \(k\) of events occurring in some amount of time \(t\) if these events occur with a constant average rate, set here to \(1\) for convenience, independently of the time since the last event.

The parameter \(t\) is the mean of the Poisson distribution. We worked out the \(n\)th moment of the Poisson distribution with mean \(1\). In other words, we worked out the expected value of \(k^n\) in this case:

\[ \sum_{k = 0}^\infty k^n P(k,1) \; = \; \frac{1}{e} \sum_{k = 0}^\infty \frac{k^n}{k!} \]

We got the number of partitions of an \(n\)-element set, which is called the Bell number \(B_n\), after the author of a fantasy novel about a planet where all mathematics is done by men.

This result is called Dobiński's formula:

\[ \frac{1}{e} \sum_{k = 0}^\infty \frac{k^n}{k!} = B_n \]

Dobiński published a paper about it in 1877, but it seems to me that he only proved some special cases.

The number of partitions of an \(n\)-element set into \(i\) parts, \(\left\{ \begin{array}{c} n \\ i \end{array} \right\} \), is called a Stirling number of the second kind. Given the definitions it's obvious that

\[ \sum_{i = 0}^\infty \left\{ \begin{array}{c} n \\ i \end{array} \right\} = B_n \]

(We can stop the sum at \(i = n\) without changing its value.)

The identity

\[ k^n = \sum_{i = 0}^k \left\{ \begin{array}{c} n \\ i \end{array} \right\} \, k^{\underline{i}} \]

is more interesting, since it connects Stirling numbers of the second kind to falling powers, which are also known as falling factorials, falling sequential products, and various other things. The proof I gave comes from here:

• Gian-Carlo Rota, The number of partitions of a set, American Mathematical Monthly 71 (1964), 498–504.

Category theorists will say it relies on the uniqueness of the epi-mono factorization of a function between finite sets. Rota doesn't use these terms, but he talks about the 'kernel' of a function, which is his name for the epi in this factorization.

In this paper Rota goes ahead and proves Dobiński's formula, but he does not explicitly discuss its connection to the Poisson distribution. Someone on Wikipedia translated his argument into the language of probability, which I like. Unfortunately the Wikipedia article merely reduces Dobiński's formula to the identity

\[ \frac{1}{e} \sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} = 1\]

and stops there, remarking that the \(i\)th factorial moment of the Poisson distribution of mean \(1\) equals \(1\). Factorial moments are like ordinary moments, but with the power \(k^i\) replaced by the falling power \(k^{\underline{i}}\).
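Here's a quick numerical check that the factorial moments of the Poisson distribution of mean \(1\) really all equal \(1\) (a Python sketch with the sum truncated):

```python
from math import e, factorial, prod

def falling_power(k, i):
    return prod(k - j for j in range(i))

def factorial_moment(i, terms=80):
    """Truncation of the i-th factorial moment of Poisson(1):
    (1/e) * sum_k k(k-1)...(k-i+1) / k!."""
    return sum(falling_power(k, i) / factorial(k) for k in range(terms)) / e

for i in range(8):
    print(i, factorial_moment(i))  # each ≈ 1.0
```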

This identity is probably well known, but I was frustrated to see the argument fizzle out before it ended. Then I saw why

\[ \sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} = e \]

It follows from some facts about species and groupoid cardinality.

If you have a species, namely a functor \(F\) from the groupoid of finite sets to the category of sets, you can think of it as describing a type of structure you can put on finite sets. For each natural number \(k\), \(F(k)\) is the set of structures of this type that you can put on a \(k\)-element set. The species has a generating function

\[ f(x) = \sum_{k = 0}^\infty \frac{|F(k)|}{k!} x^k \]

and the number \(f(1)\) has a nice meaning if the sum involved is well-defined. The category of elements \(\int F\) has finite sets equipped with the given structure as objects and bijections preserving the structure as morphisms. This category \(\int F\) is a groupoid, and the number \(f(1)\) is just the groupoid cardinality of \(\int F\).

Now fix a finite set \(i\). There's a species \(F\) that assigns to each finite set \(k\) the set of all injections \(i \hookrightarrow k\). The generating function of this species is

\[ f(x) = \sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} x^k \]

Thus,

\[ f(1) = \sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} \]

is the cardinality of the groupoid \(\int F\) whose objects are finite sets equipped with an inclusion of the set \(i\). But this groupoid is equivalent to the groupoid of finite sets and bijections! Why? Because the morphisms in \(\int F\) are bijections that preserve the inclusion of the set \(i\). Thus, points in the image of this inclusion have no 'freedom of movement', and our morphism boils down to an arbitrary bijection between the remaining points.

Since \(\int F\) is equivalent to the groupoid of finite sets, it must have the same cardinality, namely

\[ \sum_{k = 0}^\infty \frac{1}{k!} = e \]

So, we get

\[ \sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} = e \]

Later, Akiva Weinberger pointed out to me that this identity can be shown by a simple reindexing:

\[ \sum_{k = 0}^\infty \frac{k^{\underline{i}}}{k!} \; = \; \sum_{k = i}^\infty \frac{1}{(k-i)!} \; = \; \sum_{j = 0}^\infty \frac{1}{j!} \; = \; e \]

By the way, Dobiński's formula, written this way:

\[ \sum_{k = 0}^\infty \frac{k^n}{k!} = B_n e \]

says the cardinality of the groupoid of finite sets equipped with a function from a fixed finite set is \(e\) times the number of partitions of that set. After a while this starts to seem obvious, if you keep thinking about epi-mono factorizations.

So groupoid cardinality establishes a nice connection between Poisson distributions, partitions, and injections. I doubt I've gotten to the bottom of it, since I haven't really seen why the combinatorics is connected to Poisson processes. I'm also not sure how much all this will help me with permutations. But it was fun.

Addendum. It turns out there's a much shorter proof of Dobiński's formula if we know the generating function for the species of partitions.

Let's recall that first. There's a species of partitions, \(\mathrm{Part}\), such that \(\mathrm{Part}(X)\) is the set of partitions of the finite set \(X\). Since the Bell numbers count partitions,

\[ |\mathrm{Part}|(x) = \sum_{n \ge 0} \frac{|\mathrm{Part}(n)|}{n!} x^n = \sum_{n \ge 0} \frac{B_n}{n!} x^n \]

To put a partition on a finite set amounts to chopping it into a finite set of nonempty finite sets, so

\[ \mathrm{Part} \cong \mathrm{Exp} \circ \mathrm{NE} \]

and thus

\[ |\mathrm{Part}| = |\mathrm{Exp}| \circ |\mathrm{NE}| \]

where \(\mathrm{Exp}\) is the species 'being a finite set' and \(\mathrm{NE}\) is the species 'being a nonempty finite set'. Since

\[ |\mathrm{Exp}|(x) = \sum_{n \ge 0} \frac{x^n}{n!} = e^x \]

and

\[ |\mathrm{NE}|(x) = \sum_{n \ge 1} \frac{x^n}{n!} = e^x - 1 \]

we have

\[ |\mathrm{Part}|(x) = e^{e^x - 1} \]

and thus

\[ \sum_{n \ge 0} \frac{B_n}{n!} x^n = e^{e^x - 1} \]
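We can confirm this by expanding \(e^{e^x - 1}\) as a truncated power series with exact rational arithmetic and reading off the coefficients. A hand-rolled Python sketch (no special library assumed):

```python
from fractions import Fraction
from math import factorial

N = 10  # truncate all power series at degree N

def mul(a, b):
    """Multiply two truncated power series given as coefficient lists."""
    c = [Fraction(0)] * (N + 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j <= N:
                c[i + j] += ai * bj
    return c

# g(x) = e^x - 1: no constant term, so g^j only affects degrees >= j
g = [Fraction(0)] + [Fraction(1, factorial(m)) for m in range(1, N + 1)]

# exp(g(x)) = sum_j g(x)^j / j!, and powers past g^N don't matter
series = [Fraction(0)] * (N + 1)
power = [Fraction(1)] + [Fraction(0)] * N  # g^0 = 1
for j in range(N + 1):
    for m in range(N + 1):
        series[m] += power[m] / factorial(j)
    power = mul(power, g)

# The coefficient of x^n should be B_n / n!
for n in range(N + 1):
    print(n, series[n] * factorial(n))  # 1, 1, 2, 5, 15, 52, 203, ...
```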

Now let's use this to prove Dobiński's formula! We start by calculating the same generating function another way. We have

\[ e^{e^x} = \sum_{k \ge 0} \frac{e^{kx}}{k!} = \sum_{k \ge 0} \frac{1}{k!} \sum_{n \ge 0} \frac{(kx)^n}{n!} \]

so

\[ e^{e^x - 1} = \frac{1}{e} \sum_{k \ge 0} \frac{1}{k!} \sum_{n \ge 0} \frac{(kx)^n}{n!} \]

The coefficient of \(x^n\) in this power series is \(\displaystyle{\frac{1}{e} \sum_{k \ge 0} \frac{k^n}{k! \, n!}}\), but we already know it equals \(B_n/n!\), so Dobiński's formula follows: \[ B_n = \frac{1}{e} \sum_{k = 0}^\infty \frac{k^n}{k!} \]


You can also read comments on the n-Category Café, and make your own comments or ask questions there!


© 2019 John Baez