# March 6, 2005 {#week211}

The last time I wrote an issue of this column, the Huygens probe was bringing back cool photos of Titan. Now the European "Mars Express" probe is bringing back cool photos of Mars!

1) Mars Express website, `http://www.esa.int/SPECIALS/Mars_Express/index.html`

There are some tantalizing pictures of what might be a "frozen sea" --- water ice covered with dust --- near the equator in the Elysium Planitia region:

$$\href{http://www.esa.int/SPECIALS/Mars_Express/SEMCHPYEM4E_0.html}{\includegraphics[max width=0.65\linewidth]{../images/mars_packice.jpg}}$$

2) Mars Express sees signs of a "frozen sea", `http://www.esa.int/SPECIALS/Mars_Express/SEMCHPYEM4E_0.html`

Ice has already been found at the Martian poles --- it's easily visible there, and Mars Express is getting some amazing closeups of it now --- here's a view of some ice on sand at the north pole:

$$\href{http://www.esa.int/SPECIALS/Mars_Express/SEMLF6D3M5E_1.html}{\includegraphics[max width=0.65\linewidth]{../images/mars_pole.jpg}}$$

3) Glacial, volcanic and fluvial activity on Mars: latest images, `http://www.esa.int/SPECIALS/Mars_Express/SEMLF6D3M5E_1.html`

What's new is the possibility of large amounts of water in warmer parts of the planet.

Now for some math. It's always great when two subjects you're interested in turn out to be bits of the same big picture. That's why I've been really excited lately about Bott periodicity and the "super-Brauer group". I wrote about Bott periodicity in ["Week 105"](#week105), and about the Brauer group in ["Week 209"](#week209), but I should remind you about them before putting them together.

Bott periodicity is all about how math and physics in $(n+8)$-dimensional space resemble math and physics in $n$-dimensional space. It's a weird and wonderful pattern that you'd never guess without doing some calculations. It shows up in many guises, which all turn out to be related. The simplest one to verify is the pattern of Clifford algebras.

You're probably used to the complex numbers, where you throw in just *one* square root of $-1$, called $i$. And maybe you've heard of the quaternions, where you throw in *two* square roots of $-1$, called $i$ and $j$, and demand that they anticommute:
$$ij = -ji$$
This implies that $k = ij$ is another square root of $-1$. Try it and see!

In the late 1800s, Clifford realized there's no need to stop here. He invented what we now call the "Clifford algebras" by starting with the real numbers and throwing in $n$ square roots of $-1$, all of which anticommute with each other. The result is closely related to rotations in $n+1$ dimensions, as I explained in ["Week 82"](#week82).

I'm not sure who first worked out all the Clifford algebras --- perhaps it was Cartan --- but the interesting fact is that they follow a periodic pattern. If we use $C_n$ to stand for the Clifford algebra generated by $n$ anticommuting square roots of $-1$, they go like this:

| $n$ | $C_n$ |
| :-- | :---- |
| $0$ | $\mathbb{R}$ |
| $1$ | $\mathbb{C}$ |
| $2$ | $\mathbb{H}$ |
| $3$ | $\mathbb{H}\oplus\mathbb{H}$ |
| $4$ | $\mathbb{H}(2)$ |
| $5$ | $\mathbb{C}(4)$ |
| $6$ | $\mathbb{R}(8)$ |
| $7$ | $\mathbb{R}(8)\oplus\mathbb{R}(8)$ |

where:

- $\mathbb{R}(n)$ means $n \times n$ real matrices,
- $\mathbb{C}(n)$ means $n \times n$ complex matrices, and
- $\mathbb{H}(n)$ means $n \times n$ quaternionic matrices.

All these become algebras with the usual addition and multiplication of matrices.
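
By the way, here's the quick check behind the $n = 2$ row of that table --- the "try it and see" from a moment ago, using nothing but the rules $e_1^2 = e_2^2 = -1$ and $e_1 e_2 = -e_2 e_1$. Setting $k = e_1 e_2$, we get
$$k^2 = e_1 e_2 e_1 e_2 = -e_1 e_1 e_2 e_2 = -(-1)(-1) = -1$$
and $k$ anticommutes with both $e_1$ and $e_2$. So $1, e_1, e_2$ and $e_1 e_2$ multiply just like $1, i, j$ and $k$ --- which is why $C_2 = \mathbb{H}$.
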
One more piece of notation: if $A$ is an algebra, $A \oplus A$ consists of pairs of guys in $A$, with pairwise addition and multiplication.

What happens next? Well, from then on things sort of "repeat" with period 8: $C_{n+8}$ consists of $16 \times 16$ matrices whose entries lie in $C_n$! So, you can remember all the Clifford algebras with the help of this eight-hour clock:

$$ \begin{tikzpicture} \draw (0,0) circle[radius=2.65cm]; \node[label=below:{$\mathbb{R}$}] at (90:2.3) {0}; \node[label=below left:{$\mathbb{C}$}] at (45:2.3) {1}; \node[label=left:{$\mathbb{H}$}] at (0:2.3) {2}; \node[label={[label distance=-2mm]above left:{$\mathbb{H}\oplus\mathbb{H}$}}] at (-45:2.3) {3}; \node[label=above:{$\mathbb{H}$}] at (-90:2.3) {4}; \node[label=above right:{$\mathbb{C}$}] at (-135:2.3) {5}; \node[label=right:{$\mathbb{R}$}] at (180:2.3) {6}; \node[label={[label distance=-2mm]below right:{$\mathbb{R}\oplus\mathbb{R}$}}] at (135:2.3) {7}; \foreach \a in {0,45,90,135,180,-135,-90,-45} \draw (\a:2.5) to (\a:2.65); \end{tikzpicture} $$

To use this clock, you have to remember to use matrices of the right size to get $C_n$ to have dimension $2^n$. So, when I write "$\mathbb{R}\oplus\mathbb{R}$" next to the "7" on the clock, I don't mean $C_7$ is really $\mathbb{R}\oplus\mathbb{R}$. To get $C_7$, you have to take $\mathbb{R}\oplus\mathbb{R}$ and beef it up until it becomes an algebra of dimension $2^7 = 128$. You do this by taking $\mathbb{R}(8)\oplus\mathbb{R}(8)$, since this has dimension $8\times8 + 8\times8 = 128$.

Similarly, to get $C_{10}$, you note that 10 is 2 modulo 8, so you look at "2" on the clock and see "$\mathbb{H}$" next to it, meaning the quaternions. But to get $C_{10}$, you have to take $\mathbb{H}$ and beef it up until it becomes an algebra of dimension $2^{10} = 1024$. You do this by taking $\mathbb{H}(16)$, since this has dimension $4\times16\times16 = 1024$.

This "beefing up" process is actually quite interesting. For any associative algebra $A$, the algebra $A(n)$ consisting of $n \times n$ matrices with entries in $A$ is a lot like $A$ itself. The reason is that they have equivalent categories of representations!

To see what I mean by this, remember that a "representation" of an algebra is a way for its elements to act as linear transformations of some vector space. For example, $\mathbb{R}(n)$ acts as linear transformations of $\mathbb{R}^n$ by matrix multiplication, so we say $\mathbb{R}(n)$ has a representation on $\mathbb{R}^n$. More generally, for any algebra $A$, the algebra $A(n)$ has a representation on $A^n$. More generally still, if we have any representation of $A$ on a vector space $V$, we get a representation of $A(n)$ on $V^n$. It's less obvious, but true, that *every* representation of $A(n)$ comes from a representation of $A$ this way.

In short, just as $n \times n$ matrices with entries in $A$ form an algebra $A(n)$ that's a beefed-up version of $A$ itself, every representation of $A(n)$ is a beefed-up version of some representation of $A$. Even better, the same sort of thing is true for maps between representations of $A(n)$. This is what we mean by saying that $A(n)$ and $A$ have equivalent categories of representations. If you just look at the categories of representations of these two algebras as abstract categories, there's no way to tell them apart! We say two algebras are "Morita equivalent" when this happens.

It's fun to study Morita equivalence classes of algebras --- say algebras over the real numbers, for example.
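
Before going on: since the "read the clock, then beef up to dimension $2^n$" recipe above is completely mechanical, here's a little Python sketch of it. This is just my own illustration of the arithmetic, with made-up labels like `"split R"` standing for $\mathbb{R}\oplus\mathbb{R}$:

```python
# The Clifford clock, hours 0..7, plus the dimension count dim C_n = 2^n.
# "R", "C", "H" are the reals, complexes and quaternions; a leading "split"
# means a direct sum of two copies of the same matrix algebra.

CLOCK = ["R", "C", "H", "split H", "H", "C", "R", "split R"]
DIM = {"R": 1, "C": 2, "H": 4}   # real dimensions of R, C, H

def clifford(n):
    """Describe C_n as (split?, division algebra, matrix size k), meaning
    k x k matrices over that division algebra (two copies if split)."""
    label = CLOCK[n % 8]
    split = label.startswith("split")
    alg = label[-1]                          # "R", "C" or "H"
    total = 2 ** n                           # dim C_n = 2^n
    if split:
        total //= 2                          # each summand carries half the dimension
    k = round((total // DIM[alg]) ** 0.5)    # solve DIM[alg] * k^2 = total
    return (split, alg, k)

print(clifford(7))    # (True, 'R', 8):   R(8) + R(8), dimension 64 + 64 = 128
print(clifford(10))   # (False, 'H', 16): H(16), dimension 4 x 16 x 16 = 1024
```
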
The tensor product of algebras gives us a way to multiply these Morita equivalence classes. If we just consider the invertible classes, we get a *group*. This is called the "Brauer group" of the real numbers.

The Brauer group of the real numbers is just $\mathbb{Z}/2$, consisting of the classes $[\mathbb{R}]$ and $[\mathbb{H}]$. These correspond to the top and bottom of the Clifford clock! Part of the reason is that
$$\mathbb{H}\otimes\mathbb{H} = \mathbb{R}(4)$$
so when we take Morita equivalence classes we get
$$[\mathbb{H}]\times[\mathbb{H}] = [\mathbb{R}]$$

But, you may wonder where the complex numbers went! Alas, the Morita equivalence class $[\mathbb{C}]$ isn't invertible, so it doesn't live in the Brauer group. In fact, we have this little multiplication table for the tensor product of algebras:

| $\otimes$ | $\mathbb{R}$ | $\mathbb{C}$ | $\mathbb{H}$ |
| :-------- | :----------: | :----------: | :----------: |
| $\mathbb{R}$ | $\mathbb{R}$ | $\mathbb{C}$ | $\mathbb{H}$ |
| $\mathbb{C}$ | $\mathbb{C}$ | $\mathbb{C}\oplus\mathbb{C}$ | $\mathbb{C}(2)$ |
| $\mathbb{H}$ | $\mathbb{H}$ | $\mathbb{C}(2)$ | $\mathbb{R}(4)$ |

Anyone with an algebraic bone in their body should spend an afternoon figuring out how this works! But I won't explain it now. Instead, I'll just note that the complex numbers are very aggressive and infectious --- tensor anything with a $\mathbb{C}$ in it and you get more $\mathbb{C}$'s. That's because they're a field in their own right --- and that's why they don't live in the Brauer group of the real numbers.

They do, however, live in the *super-Brauer* group of the real numbers, which is $\mathbb{Z}/8$ --- the Clifford clock itself! But before I explain that, I want to show you what the categories of representations of the Clifford algebras look like:

$$ \begin{tikzpicture} \draw (0,0) circle[radius=2.65cm]; \node[label=below:{\scriptsize real}] at (90:2.3) {0}; \node[label={[label distance=-2mm]below left:{\scriptsize complex}}] at (45:2.3) {1}; \node[label=left:{\scriptsize quaternionic}] at (0:2.3) {2}; \node at (-45:2.3) {3}; \node at (1,-1) {\scriptsize split}; \node at (1,-1.32) {\scriptsize quaternionic}; \node[label=above:{\scriptsize quaternionic}] at (-90:2.3) {4}; \node[label={[label distance=-2mm]above right:{\scriptsize complex}}] at (-135:2.3) {5}; \node[label=right:{\scriptsize real}] at (180:2.3) {6}; \node[label={[label distance=-2mm]below right:{\scriptsize split real}}] at (135:2.3) {7}; \foreach \a in {0,45,90,135,180,-135,-90,-45} \draw (\a:2.5) to (\a:2.65); \end{tikzpicture} $$

Here, the labels at each hour describe the type of vector space, e.g. at 3 o'clock we have split quaternionic vector spaces. You can read this information off the 8-hour Clifford clock I showed you before, at least if you know some stuff:

- A real vector space is just something like $\mathbb{R}^n$,
- A complex vector space is just something like $\mathbb{C}^n$,
- A quaternionic vector space is just something like $\mathbb{H}^n$,

and a "split" vector space is a vector space that's been written as the direct sum of two subspaces.

Take $C_3$, for example --- the Clifford algebra generated by 3 anticommuting square roots of $-1$. The Clifford clock tells us this is $\mathbb{H}\oplus\mathbb{H}$. And if you think about it, a representation of this is just a pair of representations of $\mathbb{H}$. So, it's two quaternionic vector spaces --- or if you prefer, a "split" quaternionic vector space.

Or take $C_7$. The Clifford clock says this is $\mathbb{R}\oplus\mathbb{R}$...
or at least Morita equivalent to $\mathbb{R}\oplus\mathbb{R}$: it's actually $\mathbb{R}(8)\oplus\mathbb{R}(8)$, but that's just a beefed-up version of $\mathbb{R}\oplus\mathbb{R}$, with an equivalent category of representations. So, the category of representations of $C_7$ is *equivalent* to the category of split real vector spaces. And so on.

Note that when we loop all the way around the clock, our Clifford algebra becomes $16\times16$ matrices with entries in what it was before --- but that's Morita equivalent to what it was. So, we have a truly period-8 clock of categories!

But here's the really cool part: there are also arrows going clockwise and counterclockwise around this clock! Arrows between categories are called "functors". Each Clifford algebra is contained in the next one, since they're built by throwing in more and more square roots of $-1$. So, if we have a representation of $C_n$, it gives us a representation of $C_{n-1}$. Ditto for maps between representations. So, we get a functor from the category of representations of $C_n$ to the category of representations of $C_{n-1}$. This is called a "forgetful functor", since it "forgets" that we have representations of $C_n$ and just thinks of them as representations of $C_{n-1}$. So, we have forgetful functors cycling around counterclockwise!

Even better, all these forgetful functors have "left adjoints" going back the other way. I talked about left adjoints in ["Week 77"](#week77), so I won't say much about them now. I'll just give an example. Here's a forgetful functor:
$$\mbox{complex vector spaces}\xrightarrow{\mbox{\scriptsize forget complex structure}}\mbox{real vector spaces}$$
which is one of the counterclockwise arrows on the Clifford clock. This functor takes a complex vector space and forgets your ability to multiply vectors by $i$, thus getting a real vector space. When you do this to $\mathbb{C}^n$, you get $\mathbb{R}^{2n}$. This functor has a left adjoint:
$$\mbox{complex vector spaces}\xleftarrow{\mbox{\scriptsize complexify}}\mbox{real vector spaces}$$
where you take a real vector space and "complexify" it by tensoring it with the complex numbers. When you do this to $\mathbb{R}^n$, you get $\mathbb{C}^n$.

So, we get a beautiful version of the Clifford clock with forgetful functors cycling around counterclockwise and their left adjoints cycling around clockwise! When I realized this, I drew a big picture of it in my math notebook --- I always carry around a notebook for precisely this sort of thing. Unfortunately, it's a bit hard to draw this chart in ASCII, so I won't include it here. Instead, I'll draw something easier.

For this, note the following mystical fact: the Clifford clock is symmetrical under reflection around the 3 o'clock/7 o'clock axis. It seems bizarre at first that it's symmetrical along *this* axis instead of the more obvious 0 o'clock/4 o'clock axis. But there's a good reason, which I already mentioned: the Clifford algebra $C_n$ is related to rotations in $n+1$ dimensions. I would be very happy if you had enough patience to listen to a full explanation of this fact, along with everything else I want to say. But I bet you don't... so I'll hasten on to the really cool stuff.

First of all, using this symmetry we can fold the Clifford clock in half... and the forgetful functors on one side perfectly match their left adjoints on the other side!
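
By the way, if you want the "left adjoint" business stated concretely for this example, it boils down to a natural one-to-one correspondence
$$\mathrm{Hom}_{\mathbb{C}}(\mathbb{C}\otimes_{\mathbb{R}}V,\,W)\;\cong\;\mathrm{Hom}_{\mathbb{R}}(V,\,W)$$
saying that a complex-linear map out of the complexification of a real vector space $V$ is the same thing as a real-linear map from $V$ into $W$ regarded as a real vector space. That's the standard sense in which complexification is left adjoint to forgetting the complex structure --- nothing special to the Clifford clock is being used here.
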
So, using this symmetry, we can save space by drawing this "folded" Clifford clock:

$$ \begin{tikzpicture}[yscale=1.5] \node at (0,0) {split real vector spaces}; \draw[->] (-0.25,-0.2) to node[label=left:{\scriptsize forget splitting}]{} (-0.25,-0.8); \draw[<-] (0.25,-0.2) to node[label=right:{\scriptsize double}]{} (0.25,-0.8); \node at (0,-1) {real vector spaces}; \draw[->] (-0.25,-1.2) to node[label=left:{\scriptsize complexify}]{} (-0.25,-1.8); \draw[<-] (0.25,-1.2) to node[label=right:{\scriptsize forget complex structure}]{} (0.25,-1.8); \node at (0,-2) {complex vector spaces}; \draw[->] (-0.25,-2.2) to node[label=left:{\scriptsize quaternionify}]{} (-0.25,-2.8); \draw[<-] (0.25,-2.2) to node[label=right:{\scriptsize forget quaternionic structure}]{} (0.25,-2.8); \node at (0,-3) {quaternionic vector spaces}; \draw[->] (-0.25,-3.2) to node[label=left:{\scriptsize double}]{} (-0.25,-3.8); \draw[<-] (0.25,-3.2) to node[label=right:{\scriptsize forget splitting}]{} (0.25,-3.8); \node at (0,-4) {split quaternionic vector spaces}; \end{tikzpicture} $$

The forgetful functors march upwards on the right, and their left adjoints march back down on the left! The arrows going between 7 o'clock and 0 o'clock look a bit weird:

$$ \begin{tikzpicture}[yscale=1.5] \node at (0,0) {split real vector spaces}; \draw[->] (-0.25,-0.2) to node[label=left:{\scriptsize forget splitting}]{} (-0.25,-0.8); \draw[<-] (0.25,-0.2) to node[label=right:{\scriptsize double}]{} (0.25,-0.8); \node at (0,-1) {real vector spaces}; \end{tikzpicture} $$

Why is "forget splitting" on the left, where the left adjoints belong, when it's obviously an example of a forgetful functor? One answer is that this is just how it works. Another answer is that it happens when we wrap all the way around the clock --- it's like how going from midnight to 1 am counts as going forwards in time even though the number is getting smaller. A third answer is that the whole situation is so symmetrical that the functors I've been calling "left adjoints" are also "right adjoints" of their partners! So, we can change our mind about which one is "forgetful", without getting in trouble.

But enough of that: I really want to explain how this stuff is related to the super-Brauer group, and then tie it all in to the *topology* of Bott periodicity. We'll see how far I get before giving up in exhaustion....

What's a super-Brauer group? It's just like a Brauer group, but where we use superalgebras instead of algebras! A "superalgebra" is just physics jargon for a $\mathbb{Z}/2$-graded algebra --- that is, an algebra $A$ that's a direct sum of an "even" or "bosonic" part $A_0$ and an "odd" or "fermionic" part $A_1$:
$$A = A_0 \oplus A_1$$
such that multiplying a guy in $A_i$ and a guy in $A_j$ gives a guy in $A_{i+j}$, where we add the subscripts $\mod 2$.

The tensor product of superalgebras is defined differently than it is for ordinary algebras. If $A$ and $B$ are ordinary algebras, when we form their tensor product, we decree that everybody in $A$ commutes with everyone in $B$. For superalgebras we decree that everybody in $A$ "supercommutes" with everyone in $B$ --- meaning that
$$ab = ba$$
if either $a$ or $b$ is even (bosonic), while
$$ab = -ba$$
if $a$ and $b$ are both odd (fermionic).

Apart from these modifications, the super-Brauer group works almost like the Brauer group. We start with superalgebras over our favorite field --- here let's use the real numbers. We say two superalgebras are "Morita equivalent" if they have equivalent categories of representations.
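
Just to make the sign rule vivid, here's a tiny sketch --- my own toy calculation, not anything from the references below --- of the super tensor product of two copies of $C_1$. Each copy has one odd generator squaring to $-1$ ($e$ for the first copy, $f$ for the second), and the supercommutation rule above forces a sign whenever an $f$ moves past an $e$:

```python
# Elements of C_1 (x) C_1 are dicts mapping (i, j) to a real coefficient,
# standing for e^i (x) f^j with i, j in {0, 1}.  Both e and f are odd and
# square to -1; supercommutation gives the sign (-1)^{|b1||a2|} in
# (a1 (x) b1)(a2 (x) b2) = (-1)^{|b1||a2|} (a1 a2) (x) (b1 b2).

def mult(x, y):
    out = {}
    for (i, j), c in x.items():
        for (k, l), d in y.items():
            coeff = c * d * (-1) ** (j * k)   # sign from moving f^j past e^k
            i2, j2 = i + k, j + l
            if i2 == 2:                       # e^2 = -1
                coeff, i2 = -coeff, 0
            if j2 == 2:                       # f^2 = -1
                coeff, j2 = -coeff, 0
            out[(i2, j2)] = out.get((i2, j2), 0) + coeff
    return out

E, F = {(1, 0): 1}, {(0, 1): 1}               # e (x) 1  and  1 (x) f
print(mult(E, E))                             # {(0, 0): -1}
print(mult(F, F))                             # {(0, 0): -1}
print(mult(E, F), mult(F, E))                 # {(1, 1): 1} {(1, 1): -1}
```

So the two odd generators anticommute and each squares to $-1$: exactly the relations defining $C_2$. That's the germ of the formula coming up next.
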
We can multiply these Morita equivalence classes of superalgebras by taking tensor products, and if we just keep the invertible classes we get a group: the super-Brauer group. As I've hinted already, the super-Brauer group of the real numbers is $\mathbb{Z}/8$ --- just the Clifford algebra clock in disguise!

Here's why: the Clifford algebras all become superalgebras if we decree that all the square roots of $-1$ that we throw in are "odd" elements. And if we do this, we get something great:
$$C_n \otimes C_m = C_{n+m}$$
The point is that all the square roots of $-1$ we threw in to get $C_n$ *anticommute* with those we threw in to get $C_m$. Taking Morita equivalence classes, this means
$$[C_n] [C_m] = [C_{n+m}]$$
but we already know that
$$[C_{n+8}] = [C_n]$$
so we get the group $\mathbb{Z}/8$. It's not obvious that this is *all* the super-Brauer group, but it actually is --- that's the hard part.

Now let's think about what we've got. We've got the super-Brauer group, $\mathbb{Z}/8$, which looks like an 8-hour clock. But before that, we had the categories of representations of Clifford algebras, which formed an 8-hour clock with functors cycling around in both directions. In fact these are two sides of the same coin --- or clock, actually. The super-Brauer group consists of Morita equivalence classes of Clifford algebras, where Morita equivalence means "having equivalent categories of representations". But, our previous clock just shows their categories of representations!

This suggests that the functors cycling around in both directions are secretly an aspect of the super-Brauer group. And indeed they are! The functors going clockwise are just "tensoring with $C_1$", since you can tensor a representation of $C_n$ with $C_1$ and get a representation of $C_{n+1}$. And the functors going counterclockwise are "tensoring with $C_{-1}$"... or $C_7$ if you insist, since $C_{-1}$ doesn't strictly make sense, but $7$ equals $-1 \mod 8$, so it does the same job.

Hmm, I think I'm tired out. I didn't even get to the topology yet! Maybe that'll be good as a separate little story someday. If you can't wait, just read this:

4) John Milnor, _Morse Theory_, Princeton U. Press, Princeton, New Jersey, 1963.

You'll see here that a representation of $C_n$ is just the same as a vector space with $n$ different anticommuting ways to "rotate vectors by 90 degrees", and that this is the same as a real inner product space equipped with a map from the $n$-sphere into its rotation group, with the property that the north pole of the $n$-sphere gets mapped to the identity, and each great circle through the north pole gives some action of the circle as rotations. Using this, and stuff about Clifford algebras, and some Morse theory, Milnor gives a beautiful proof that
$$\Omega^8(\mathrm{SO}(\infty)) \sim \mathrm{SO}(\infty)$$
or in English: the 8-fold loop space of the infinite-dimensional rotation group is homotopy equivalent to the infinite-dimensional rotation group!

The thing I really like, though, is that Milnor relates the forgetful functors I was talking about to the process of "looping" the rotation group. That's what these maps from spheres into the rotation group are all about... but I want to really explain it all someday!

I learned about the super-Brauer group here:

5) V. S. Varadarajan, _Supersymmetry for Mathematicians: An Introduction_, American Mathematical Society, Providence, Rhode Island, 2004.

though the material here on this topic is actually a summary of some lectures by Deligne in another book I own:

6) P. Deligne, P. Etingof, D. S. Freed, L. Jeffrey, D. Kazhdan, J. Morgan, D. R. Morrison and E. Witten, _Quantum Fields and Strings: A Course for Mathematicians_, 2 volumes, American Mathematical Society, Providence, 1999. Notes also available at `http://www.math.ias.edu/QFT/`

Varadarajan's book doesn't go as far, but it's much easier to read, so I recommend it as a way to get started on "super" stuff.