Lecture 17 - The Grand Synthesis

In this chapter we learned about left and right adjoints, and about joins and meets. At first they seemed like two rather different pairs of concepts. But then we learned some deep relationships between them, which I'll review below.

Today we'll conclude our discussion of Chapter 1 with two more bombshells: joins and meets are themselves adjoints, and every fact about them has a "dual".

This is a good example of how category theory works. You learn a bunch of concepts, but then you learn more and more facts relating them, which unify your understanding... until finally all these concepts collapse down like the core of a giant star, releasing a supernova of insight that transforms how you see the world!

Let me start by reviewing what we'd already seen. To keep things simple let me state these facts just for posets, not the more general preorders. Everything can be generalized to preorders.

In Lecture 6 we saw that given a left adjoint \( f : A \to B\), we can compute its right adjoint using joins:

[ g(b) = \bigvee \{a \in A : \; f(a) \le b \} . ]
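To see this formula in action, here is a small numerical sketch of my own (not from the lecture): take \( A = \{0,\dots,5\} \) and \( B = \{0,\dots,10\} \) with their usual orders, and let \( f(a) = 2a \). In a total order the join of a finite nonempty set is just its maximum, so the formula is easy to compute:

```python
# Computing a right adjoint from the join formula, in a finite example.
# A = {0,...,5} and B = {0,...,10} are totally ordered as usual, and
# f(a) = 2a is a monotone map (an assumed example, chosen so that all
# the joins we need exist and are just maxima).

A = range(6)
B = range(11)

def f(a):
    return 2 * a

def g(b):
    # g(b) = join of { a in A : f(a) <= b }; in a total order, the max
    return max(a for a in A if f(a) <= b)

# The defining property of the adjunction: f(a) <= b iff a <= g(b)
assert all((f(a) <= b) == (a <= g(b)) for a in A for b in B)
print([g(b) for b in B])
```

Here \( g \) turns out to be "the floor of \( b/2 \)".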

Similarly, given a right adjoint \( g : B \to A \) between posets, we can compute its left adjoint using meets:

[ f(a) = \bigwedge \{b \in B : \; a \le g(b) \} . ]

In Lecture 16 we saw that left adjoints preserve all joins, while right adjoints preserve all meets.

Then came the big surprise: if \( A \) has all joins and a monotone function \( f : A \to B \) preserves all joins, then \( f \) is a left adjoint! But if you examine the proof, you'll see we don't really need \( A \) to have all joins: it's enough that all the joins in this formula exist:

[ g(b) = \bigvee \{a \in A : \; f(a) \le b \} . ]

Similarly, if \(B\) has all meets and a monotone function \(g : B \to A \) preserves all meets, then \( g \) is a right adjoint! But again, we don't need \( B \) to have all meets: it's enough that all the meets in this formula exist:

[ f(a) = \bigwedge \{b \in B : \; a \le g(b) \} . ]

Now for the first of today's bombshells: joins are left adjoints and meets are right adjoints. I'll state this for binary joins and meets, but it generalizes.

Suppose \(A\) is a poset with all binary joins. Then we get a function

[ \vee : A \times A \to A ]

sending any pair \( (a,a') \in A \times A \) to the join \(a \vee a'\). But we can make \(A \times A\) into a poset as follows:

[ (a,b) \le (a',b') \textrm{ if and only if } a \le a' \textrm{ and } b \le b' ]

Then \( \vee : A \times A \to A\) becomes a monotone map, since you can check that

[ a \le a' \textrm{ and } b \le b' \textrm{ implies } a \vee b \le a' \vee b'. ]

And you can show that \( \vee : A \times A \to A \) is the left adjoint of another monotone function, the diagonal

[ \Delta : A \to A \times A ]

sending any \(a \in A\) to the pair \( (a,a) \).

The diagonal function is also called duplication since it duplicates any element of \(A\).

Why is \( \vee \) the left adjoint of \( \Delta \)? If you unravel what this means using all the definitions, it amounts to this fact:

[ a \vee a' \le b \textrm{ if and only if } a \le b \textrm{ and } a' \le b . ]

Note that we're applying \( \vee \) to \( (a,a') \) in the expression at left here, and applying \( \Delta \) to \( b \) in the expression at the right. So, this fact says that \( \vee \) is the left adjoint of \( \Delta \).
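If you like, you can verify this fact exhaustively in a small example. Here is a sketch of my own (not from the lecture), using the poset of subsets of \( \{0,1\} \) ordered by inclusion, where the join is union:

```python
# Exhaustively checking that join (here: union of sets) is left adjoint
# to the diagonal in the poset of subsets of {0,1} ordered by inclusion:
#     a v a' <= b   iff   a <= b and a' <= b.
from itertools import combinations

# all four subsets of {0,1} as frozensets; <= on frozensets is inclusion
A = [frozenset(c) for r in range(3) for c in combinations([0, 1], r)]

ok = all((a | a2 <= b) == (a <= b and a2 <= b)
         for a in A for a2 in A for b in A)
print(ok)
```

Only 64 triples to check, and every one of them works out.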

Puzzle 45. Prove that \( a \le a' \) and \( b \le b' \) imply \( a \vee b \le a' \vee b' \). Also prove that \( a \vee a' \le b \) if and only if \( a \le b \) and \( a' \le b \).

A similar argument shows that meets are really right adjoints! If \( A \) is a poset with all binary meets, we get a monotone function

[ \wedge : A \times A \to A ]

that's the right adjoint of \( \Delta \). This is just a clever way of saying

[ a \le b \textrm{ and } a \le b' \textrm{ if and only if } a \le b \wedge b' ]

which is also easy to check.
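Dually to the check for joins, here is a quick exhaustive verification of this fact (again my own small example), in the poset of subsets of \( \{0,1\} \), where the meet is intersection:

```python
# Checking that meet (intersection) is right adjoint to the diagonal in
# the poset of subsets of {0,1} ordered by inclusion:
#     a <= b and a <= b'   iff   a <= b ^ b'.
from itertools import combinations

A = [frozenset(c) for r in range(3) for c in combinations([0, 1], r)]

ok = all((a <= b and a <= b2) == (a <= (b & b2))
         for a in A for b in A for b2 in A)
print(ok)
```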

Puzzle 46. State and prove similar facts for joins and meets of any number of elements in a poset - possibly an infinite number.

All this is very beautiful, but you'll notice that all these facts come in pairs: one for left adjoints and one for right adjoints. We can squeeze out this redundancy by noticing that every preorder has an "opposite", where "greater than" and "less than" trade places! It's like a mirror world where up is down, big is small, true is false, and so on.

Definition. Given a preorder \( (A , \le) \) there is a preorder called its opposite, \( (A, \ge) \). Here we define \( \ge \) by

[ a \ge a' \textrm{ if and only if } a' \le a ]

for all \( a, a' \in A \). We call the opposite preorder \( A^{\textrm{op}} \) for short.
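In code, taking the opposite is just a reversal of arguments. Here is a tiny sketch (an assumed example, not from the lecture) using positive integers preordered by divisibility:

```python
# The opposite preorder, illustrated with positive integers preordered
# by divisibility: a <= b in A means "a divides b", and a <= b in A^op
# just means b <= a in A.

def leq(a, b):        # the order on A: a divides b
    return b % a == 0

def leq_op(a, b):     # the order on A^op: the reversed relation
    return leq(b, a)

print(leq(3, 6), leq_op(3, 6), leq_op(6, 3))
```

Note that reversing the arguments twice gets you back where you started, which is the content of Puzzle 48 below.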

I can't believe I've gone this far without ever mentioning \( \ge \). Now we finally have a really good reason to.

Puzzle 47. Show that the opposite of a preorder really is a preorder, and the opposite of a poset is a poset.

Puzzle 48. Show that the opposite of the opposite of \( A \) is \( A \) again.

Puzzle 49. Show that the join of any subset of \( A \), if it exists, is the meet of that subset in \( A^{\textrm{op}} \).

Puzzle 50. Show that any monotone function \(f : A \to B \) gives a monotone function \( f : A^{\textrm{op}} \to B^{\textrm{op}} \): the same function, but preserving \( \ge \) rather than \( \le \).

Puzzle 51. Show that \(f : A \to B \) is the left adjoint of \(g : B \to A \) if and only if \(f : A^{\textrm{op}} \to B^{\textrm{op}} \) is the right adjoint of \( g: B^{\textrm{op}} \to A^{\textrm{op}} \).

So, we've taken our whole course so far and "folded it in half", reducing every fact about meets to a fact about joins, and every fact about right adjoints to a fact about left adjoints... or vice versa! This idea, so important in category theory, is called duality. In its simplest form, it says that things come on opposite pairs, and there's a symmetry that switches these opposite pairs. Taken to its extreme, it says that everything is built out of the interplay between opposite pairs.

Once you start looking you can find duality everywhere, from ancient Chinese philosophy to modern computers.

But duality has been studied very deeply in category theory: I'm just skimming the surface here.

This is the end of my lectures on Chapter 1. There's more in this chapter that we didn't cover, so now it's time for us to go through all the exercises.



© 2018 John Baez