Lecture 75 - The Grand Synthesis
Let's review our progress, and try to put all the pieces of this course together in a neat package.
We started by returning to a major theme of Chapter 2: enriched categories. We saw that enriched functors between these were just a special case of something more flexible: enriched profunctors. We saw some concrete applications of these, but also their important theoretical role.
Simply put: moving from functors to profunctors is completely analogous to moving from functions to matrices! Thus, introducing profunctors gives category theory some of the advantages of linear algebra.
Recall: a function between sets
\[ f \colon X \to Y \]
can be seen as a special kind of \(X \times Y\)-shaped matrix
\[ \phi \colon X \times Y \to \mathbb{R} \]
namely one where the matrix entry \(\phi(x,y) \) is \(1\) if \(y = f(x)\), and \(0\) otherwise. In short:
\[ \phi(x,y) = \delta_{f(x), y} \]
where \(\delta\) is the Kronecker delta. Composing functions then turns out to be a special case of multiplying matrices. Here I'm using \(\mathbb{R}\) because most of you have seen matrices of real numbers, but we could equally well use \(\mathbf{Bool} = \lbrace \texttt{true}, \texttt{false} \rbrace \), and get matrices of truth values, which are just relations. Matrix multiplication has the usual composition of relations as a special case!
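If you like to compute, here is a minimal sketch in Haskell of this correspondence; the names `delta` and `matMul` are just illustrative, not from any library. It encodes a function as a \(\mathbf{Bool}\)-valued matrix, so the matrix multiplication uses `or` as the sum and `&&` as the product, and checks that multiplying matrices recovers composition of functions:

```haskell
-- A minimal sketch (hypothetical helper names): encoding a function
-- f : X -> Y as a Bool-valued X-by-Y matrix, and recovering composition
-- of functions as matrix multiplication over (or, &&).

-- The "Kronecker delta" matrix of a function: true exactly when y = f x.
delta :: Eq y => (x -> y) -> x -> y -> Bool
delta f x y = f x == y

-- Matrix multiplication over the booleans: "sum" is disjunction over
-- the middle index, "product" is conjunction.
matMul :: [y] -> (x -> y -> Bool) -> (y -> z -> Bool) -> x -> z -> Bool
matMul ys phi psi x z = or [ phi x y && psi y z | y <- ys ]

-- Example: f : {0,1,2} -> {0,1} and g : {0,1} -> {0,1,2}.
f :: Int -> Int
f n = n `mod` 2

g :: Int -> Int
g n = n + 1

-- The matrix of (g . f) agrees with the product of the two delta matrices.
check :: Bool
check = and [ matMul [0,1] (delta f) (delta g) x z == delta (g . f) x z
            | x <- [0,1,2], z <- [0,1,2] ]

main :: IO ()
main = print check  -- True
```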
Similarly, a \(\mathcal{V}\)-enriched functor
\[ F \colon \mathcal{X} \to \mathcal{Y} \]
can be seen as a special kind of \(\mathcal{V}\)-enriched profunctor
\[ \Phi \colon \mathcal{X}^{\text{op}} \times \mathcal{Y} \to \mathcal{V} \]
namely the 'companion' of \(F\), given by
\[ \Phi(x,y) = \mathcal{Y}(F(x), y) . \]
This is a fancier relative of the Kronecker delta! For matrices of booleans \( \delta_{f(x), y} = \texttt{true}\) iff \(f(x) = y\), but \( \mathcal{Y}(F(x), y) = \texttt{true}\) iff \(F(x) \le y \).
The analogy is completed by this fact: the formula for composing enriched profunctors is really just matrix multiplication written with less familiar symbols:
\[ (\Psi\Phi)(x,z) = \bigvee_{y \in \mathrm{Ob}(\mathcal{Y})} \Phi(x,y) \otimes \Psi(y,z). \]
Here \(\bigvee\) plays the role of a sum and \(\otimes\) plays the role of multiplication.
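To make this concrete, here is a small sketch in Haskell, treating preorders as \(\mathbf{Bool}\)-enriched categories, so \(\bigvee\) becomes `or` and \(\otimes\) becomes `&&`. The names `compose` and `companion` are just illustrative, not a standard API:

```haskell
-- A sketch of the composition formula for Bool-enriched profunctors
-- between finite preorders: the join becomes `or`, the tensor becomes `&&`.

type Profunctor x y = x -> y -> Bool   -- a Bool-valued "matrix"

-- Compose Phi : X -|-> Y with Psi : Y -|-> Z by "matrix multiplication":
-- the join (or) over all objects y of Y of Phi(x,y) "tensor" Psi(y,z).
compose :: [y] -> Profunctor x y -> Profunctor y z -> Profunctor x z
compose obY phi psi x z = or [ phi x y && psi y z | y <- obY ]

-- The companion of a monotone map F: Phi(x,y) = true iff F(x) <= y.
companion :: Ord y => (x -> y) -> Profunctor x y
companion f x y = f x <= y

-- Example: the companions of two monotone maps on integers.
double, succ' :: Int -> Int
double = (* 2)
succ'  = (+ 1)

-- Composing the companions over (a finite chunk of) the middle preorder:
-- double 3 = 6 <= y and y + 1 <= 7 both hold for y = 6, so this is True.
example :: Bool
example = compose [0..20] (companion double) (companion succ') 3 7

main :: IO ()
main = print example  -- True
```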
To clarify this analogy, we studied the category \(\mathbf{Prof}_\mathcal{V}\) with
- \(\mathcal{V}\)-enriched categories as objects
and
- \(\mathcal{V}\)-enriched profunctors as morphisms.
We saw that it was a compact closed category. This means that you can work with morphisms in this category using string diagrams, and you can bend the strings around using caps and cups. In short, \(\mathcal{V}\)-enriched profunctors are like circuits made of components connected by flexible pieces of wire, which we can stick together to form larger circuits.
And while you may not have learned it in your linear algebra class, this 'flexibility' is exactly one of the advantages of linear algebra! For any field \(k\) (for example the real numbers \(\mathbb{R}\)) there is a category \(\mathrm{FinVect}_k\) with
- finite-dimensional vector spaces over \(k\) as objects
and
- linear maps as morphisms.
This category is actually equivalent to the category with finite sets as objects and \(k\)-valued matrices as morphisms, where we compose matrices by matrix multiplication. And like \(\mathbf{Prof}_\mathcal{V}\), the category \(\mathrm{FinVect}_k\) is compact closed, as I mentioned last time. So, while a function between sets has a rigidly defined 'input' and 'output' (i.e. domain and codomain), a linear map between finite-dimensional vector spaces can be 'bent' or 'turned around' in various ways, as you may have first seen when you learned about the transpose of a matrix.
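If you want a down-to-earth check of this 'turning around', here is a tiny Haskell sketch (the matrix representation is ad hoc, not a library API) verifying that transposition reverses the order of composition: \( (AB)^{\mathrm{T}} = B^{\mathrm{T}} A^{\mathrm{T}} \).

```haskell
-- A small numeric check of the "bending" that transposition performs:
-- transposing reverses the order of composition, (A B)^T = B^T A^T.
import Data.List (transpose)

type Matrix = [[Double]]

-- Multiply an m-by-n matrix by an n-by-p matrix (rows of a, columns of b).
mul :: Matrix -> Matrix -> Matrix
mul a b = [ [ sum (zipWith (*) row col) | col <- transpose b ] | row <- a ]

a, b :: Matrix
a = [ [1, 2]
    , [3, 4] ]
b = [ [0, 1, 2]
    , [5, 6, 7] ]

main :: IO ()
main = print (transpose (mul a b) == mul (transpose b) (transpose a))  -- True
```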
There's one other piece of this story whose full significance I haven't quite explained yet.
We've seen pairs of adjoint functors, and we've seen 'duals' in compact closed categories. In fact they're closely related! There is a general concept of adjunction that has both of these as special cases! And adjunctions give something all you functional programmers have probably been wishing I'd talk about all along: monads. So I'll try to explain this next time.
To read other lectures go here.