These are notes on a reworking of the theory of BN-pairs (as given in Brown's book) through a systematic study of the interplay between Boolean group algebras and Boolean Hecke algebras. One benefit of this approach might be a basis for comparing various notions of "model" under development (e.g., buildings in the sense of Brown or Dolan, as compared to Boolean or Heyting Hecke algebras). Another might be to cast the Brown/Tits theory of BN-pairs in more congenial (or less eye-glazing) terms.

First a small comment which relates to our last discussion: while it's true that Brown initially includes S (the set of involutions generating the Weyl group) as part of the data of a BN-pair, he proves later in the chapter that S is uniquely determined (as the set of w in W such that B union BwB is a subgroup of G). So you were right: S is described by *properties* of a BN-pair.

So for us the data will be (G, B, N) where B and N are subgroups which jointly generate G, and T = B intersect N is normal in N. Define W := N/T. Before stating the Hecke algebra-based axioms, I'll make some general remarks:

(1) There is a sup-preserving map between power sets

    [ ]: 2[W] --> 2[B\G/B]

sending w to BwB (where BwB := Bw'B for any w' in N which maps to w under the projection N --> W). I claim that even under the minimal hypotheses given above, this is a lax homomorphism from the Boolean group algebra to the Boolean Hecke algebra.

To see this more clearly, let's express the Boolean Hecke algebra structure directly in terms of 2[B\G/B], starting from the formula

    J_[k] (gB) = Sum_(g' in gB) g'kB

(the normalizing factor 1/|B| being absorbed in the Boolean case). Composing, we obtain

    J_[k'] J_[k] (gB) = J_[k'] Sum_(g' in gB) g'kB
                      = Sum_(g' in gB) J_[k'] (g'kB)
                      = Sum_(g' in gB) Sum_(g'' in g'kB) g''k'B
                      = Sum_(g' in gB) [ Sum_(b in B) g'kbk'B ]

which makes it clear that J_[k'] J_[k] = Sum_(b in B) J_[kbk']. Let me abbreviate the right side to J_[kBk'], and the last equation to

    [k].[k'] = [kBk'] := Sum_(b in B) [kbk']

where [k] denotes the orientation BkB for any element k in G, and [k].[k'] denotes multiplication in 2[B\G/B], here taken as the multiplication which is *opposite* to composition of the random jump operators J_[k]. We have

    [ww'] <= [wBw'] = [w].[w']

so that [ ]: 2[W] --> 2[B\G/B] is a (normal) lax homomorphism. (If you don't like passing to the opposite of the algebra of random jump operators, another option is to consider [ ] as a lax homomorphism from 2[W^{op}] and use the inversion isomorphism 2[W] ~ 2[W^{op}]. But I'll stick to how I'm doing it here.)

(2) We will define the subset S of W to be the set of elements w of W for which [w]^2 = [w] + 1 is a minimal polynomial equation satisfied in the Boolean Hecke algebra. (This means [w] + 1 is a nontrivial idempotent, which, translated into Brown's language, means that BwB union B is a submonoid of G. Under the BN-pair axiom that each such w is an involution in W, this is actually a subgroup of G.)

BN-pair axioms
----------------------

(1) The subset S defined above is a set of involutions which generates W.

(2) For s in S and w in W, the following holds in the Boolean Hecke algebra: [s].[w] <= [w] + [sw].

I hope these conditions seem motivated and intuitively reasonable.
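For concreteness, here is a minimal computational sketch (in Python; only an illustration, not part of Brown's treatment) of the standard BN-pair for G = GL(3, F_2), with B the upper-triangular subgroup and N the permutation matrices, so that T is trivial and W = S_3. It builds the Boolean Hecke product [k].[k'] = Sum_(b in B) [kbk'] directly from double cosets and checks the quadratic relation defining S together with axiom (2). All helper names (double_coset, hecke_mul, etc.) are ad hoc.

    # Sketch only: Boolean Hecke algebra checks for G = GL(3, F_2),
    # B = upper-triangular subgroup, W = S_3 (permutation matrices).
    from itertools import permutations, product

    def mat_mul(a, b):
        # product of 3x3 matrices over F_2
        return tuple(tuple(sum(a[i][k] * b[k][j] for k in range(3)) % 2
                           for j in range(3)) for i in range(3))

    def det2(m):
        # determinant mod 2
        (a, b, c), (d, e, f), (g, h, i) = m
        return (a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)) % 2

    # G = GL(3, F_2), 168 elements; B = upper-triangular subgroup, 8 elements
    G = [(m[0:3], m[3:6], m[6:9]) for m in product((0, 1), repeat=9)]
    G = [g for g in G if det2(g) == 1]
    B = [g for g in G if g[1][0] == 0 and g[2][0] == 0 and g[2][1] == 0]

    def double_coset(g):
        # the double coset BgB as a frozenset, used as the label [g]
        return frozenset(mat_mul(mat_mul(b1, g), b2) for b1 in B for b2 in B)

    def hecke_mul(x, y):
        # Boolean Hecke product [x].[y] = Sum_(b in B) [xby], as a set of double cosets
        return {double_coset(mat_mul(mat_mul(x, b), y)) for b in B}

    def perm_mat(p):
        # permutation matrix representing p in S_3 (a representative of w in W)
        return tuple(tuple(1 if p[i] == j else 0 for j in range(3)) for i in range(3))

    I3 = perm_mat((0, 1, 2))
    S1, S2 = perm_mat((1, 0, 2)), perm_mat((0, 2, 1))      # simple reflections
    Wmats = [perm_mat(p) for p in permutations(range(3))]

    # the quadratic relation [s]^2 = [s] + 1 defining S
    for s in (S1, S2):
        assert hecke_mul(s, s) == {double_coset(s), double_coset(I3)}

    # axiom (2): [s].[w] <= [w] + [sw]
    for s in (S1, S2):
        for w in Wmats:
            assert hecke_mul(s, w) <= {double_coset(w), double_coset(mat_mul(s, w))}
    print("quadratic relation and axiom (2) check out for GL(3, F_2)")

GL(3, F_2) is small enough (168 elements) that everything can be done by brute force over explicit double cosets, which is exactly the point: the axioms are statements about which double cosets can appear in a product, and that is directly inspectable here.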
Axiom (2) says that when we start with a flag f in the orientation class described by [w] (oriented with respect to a flag stabilized by B) and apply a random jump which changes a single feature of f corresponding to [s], either we stay in the same orientation class, or we jump into a Bruhat cell or chamber which is s-adjacent to the one described by [w].

Remark: These axioms are not quite the same as those given by Brown/Tits. In particular, the assumption that the elements of S are involutions, and the assumption that their minimal polynomial equations in the Hecke algebra are given as above, may be dropped: just add the assumption that there exists a set S of generators of W such that axiom (2) is satisfied, and also that sBs^{-1} is not contained in B for any s in S. The equations s^2 = 1 in the group algebra and [s]^2 = [s] + 1 in the Hecke algebra are derivable from these assumptions (and S is uniquely determined as the set of elements obeying these equations). As usual in axiomatics, the degree to which one applies Occam's razor is a matter of personal aesthetics -- at the expense of some extra whittling down, I have chosen a formulation which emphasizes the interactions between the group and Hecke algebras. In practice it is probably no harder to verify the axioms as given here than it is to verify the Brown/Tits axioms, so the difference is probably harmless.

Consequences
----------------------

(a) One consequence is a complete description of parabolic subgroups (i.e. subgroups intermediate between B and G -- Brown actually reserves the word 'parabolic' for a subgroup conjugate to one between B and G) on the basis of the axioms.

First we show that unions of double cosets BwB, where w ranges over the subgroup <S'> of W generated by a subset S' of S, are parabolic subgroups. Or, translated into the language of Hecke algebras: Sum_(w in <S'>) [w] is idempotent for each subset S' of S. (The idempotency says that the union of such double cosets [w] is the submonoid generated by B and {s : s in S'}, but since elements s in S are involutions, it's actually a subgroup.)

It suffices to show that [w].[w'] <= Sum_(v in <S'>) [v] whenever w, w' belong to <S'>. This is shown by induction on the length d of a word s1...sd in S' used to write w. The case d = 0 is obvious. The inductive step is an easy consequence of the lax homomorphism property and axiom (2):

    [s1(s2...sd)].[w'] <= [s1].[s2...sd].[w']           (lax homomorphism)
                       <= [s1].Sum_(v in <S'>) [v]       (inductive hypothesis)
                       <= Sum_(v in <S'>) ([v] + [s1 v])  (axiom (2))
                        = Sum_(v in <S'>) [v].   QED

In particular, the union of the double cosets [w] over all w in W (which Brown writes as BWB) is a subgroup of G, and since B and N jointly generate G, this subgroup must be G. I will save for later the proof of

Proposition 1: All intermediate subgroups between B and G are described by idempotents in the Hecke algebra of this form. (Hence they are in one-one correspondence with subsets S' of S.)

(b) Our next task will be to establish the Bruhat decomposition of G. We had just observed that the union of the double cosets BwB over all w in W is G, i.e. that [ ]: 2[W] --> 2[B\G/B] is surjective. Now we prove this map is injective, i.e. that [w] = [w'] implies w = w'.

The proof is by induction on d = min{d(w), d(w')}, where *d(w)* is the minimum length of a word in S which evaluates to w. We may assume d = d(w'). If d = 0, then w' = 1, and [w] = [1] means BnB = B for any n in N which maps to w under N --> W; then n lies in B, hence in T = B intersect N, and so w = 1 by definition of W = N/T. If d > 0, write w' in the form sw'' where s is in S and d(w'') = d - 1. By assumption [sw''] = [w], and hence

    [w''] = [s^2 w''] <= [s].[sw''] = [s].[w] <= [w] + [sw]

by the lax homomorphism property and axiom (2). Since distinct classes of the form [v] are disjoint, either [w''] = [w] or [w''] = [sw]. By induction (note d(w'') = d - 1 < d), w'' = w or w'' = sw. The first case is impossible since d(w'') < d <= d(w). Hence w'' = sw, whence w' = sw'' = s(sw) = w (using the fact that s is an involution). This completes the proof.
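Continuing the toy GL(3, F_2) sketch from above (and reusing its ad hoc helpers G, B, I3, S1, S2, Wmats, mat_mul, double_coset, hecke_mul), both (a) and (b) can be watched happening directly: the six classes [w] exhaust B\G/B without repetition, and each union Sum_(w in <S'>) [w] is closed under the Boolean Hecke product.

    # Sketch only, reusing the helpers from the earlier GL(3, F_2) example.

    # (b) Bruhat decomposition: w |-> BwB is a bijection W --> B\G/B
    classes = {double_coset(w) for w in Wmats}
    assert len(classes) == len(Wmats) == 6            # injectivity
    assert sum(len(c) for c in classes) == len(G)     # surjectivity: the BwB cover G

    # (a) for each subset S' of S, the union of the [w] over w in <S'>
    # is closed under the Hecke product, i.e. gives a parabolic subgroup
    def generated_subgroup(gens):
        # subgroup of W (as matrices) generated by gens, by naive closure
        elems, frontier = {I3}, {I3}
        while frontier:
            frontier = {mat_mul(x, s) for x in frontier for s in gens} - elems
            elems |= frontier
        return elems

    for gens in [(), (S1,), (S2,), (S1, S2)]:
        P = {double_coset(w) for w in generated_subgroup(gens)}
        for Cx in P:
            for Cy in P:
                assert hecke_mul(next(iter(Cx)), next(iter(Cy))) <= P
    print("Bruhat decomposition and parabolic idempotents check out")

(Multiplying classes via arbitrary representatives is harmless: replacing a representative only translates the set kBk' by elements of B, which does not change the set of double cosets it meets.)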
(c) Now we prove a result which is used in the proof that the Hecke algebra presentation (recalled below) is correct.

Lemma 1: If d(sw) >= d(w), then [s].[w] = [sw].

Proof: By induction on d = d(w). The case d = 0 is clear. For d > 0, write w = w't where t is in S and d(w') = d - 1. We will show that ([s].[w]) /\ [w] = 0, so that [s].[w] <= [w] + [sw] (axiom (2)) forces [s].[w] <= [sw]; since also [sw] <= [s].[w] by the lax homomorphism property, this gives [s].[w] = [sw].

From

    d(w') + 1 = d(w) <= d(sw) = d(sw't) <= d(sw') + 1

(where the first inequality is the hypothesis of the lemma), we get d(w') <= d(sw'). Therefore, by inductive hypothesis, [sw'] = [s].[w']. Now we calculate:

    ([s].[w]) /\ [w] = ([s].[w't]) /\ [w]
                    <= ([s].[w'].[t]) /\ [w]       (lax homomorphism)
                     = ([sw'].[t]) /\ [w]          (equation above)
                    <= ([sw'] + [sw't]) /\ [w]     (variant of axiom (2))
                     = ([sw'] + [sw]) /\ [w]
                     = ([sw'] /\ [w]) + ([sw] /\ [w])

(The right-handed variant [v].[t] <= [v] + [vt] of axiom (2) used here follows by applying the inversion g |-> g^{-1}, which induces an anti-automorphism of the Boolean Hecke algebra sending [g] to [g^{-1}].) The proof is complete if both summands in the last line are 0. Because [ ] is injective (from (b) above), it suffices that sw != w (clear, since s != 1) and that sw' != w -- the latter follows from w' != sw, which we know from d(w') < d(w) <= d(sw). QED

(d) Now we prove our principal theorem on Hecke algebra presentations.

Theorem: If s1...sd is a minimal word in S which evaluates to w, then [s1].[s2].(...).[sd] = [w] in the Boolean Hecke algebra.

Proof: By induction on d. The case d = 1 is trivial. Now s1...sd is a minimal word only if s2...sd is a minimal word, so [s2].(...).[sd] = [s2...sd] by inductive hypothesis. Since d(s2...sd) = d - 1 < d = d(s1 s2...sd) by minimality, the hypothesis of Lemma 1 is satisfied, so that [s1].[s2...sd] = [s1...sd] = [w], and the inductive step goes through. QED

Corollary: Under the hypothesis of the preceding theorem, [s1].(...).[sd] = [w] in the Hecke algebra taken over any Q_(+)-algebra as base rig.

Proof: It suffices to consider the case where the base rig is Q_(+). Writing out [s1].(...).[sd] as a linear combination of basis elements [v] indexed by B\G/B,

    [s1].(...).[sd] = Sum_(v in W) a_v [v]    (a_v in Q_(+)),

we see by applying the change of base rig Q_(+) --> 2 that the only nonzero a_v occurs when v = w (using the theorem). This coefficient a_w = 1 by conservation of probability. QED

(e) In serious applications of the theorem and corollary of (d), we need to take advantage of the fact that (W, S) is a Coxeter system. To this end we first prove a companion to Lemma 1.

Lemma 2: If d(w) >= d(sw), then [s].[w] = [w] + [sw].

Proof: Since d(s(sw)) = d(w) >= d(sw), Lemma 1 (applied to sw) gives [w] = [ssw] = [s].[sw], so that

    [s].[w] = [s]^2 .[sw] = ([s] + 1).[sw]    (defining property of S)
            = [s].[sw] + [sw] = [w] + [sw].   QED
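In the same toy GL(3, F_2) setup (again reusing the ad hoc helpers from the first sketch), Lemmas 1 and 2 and the theorem of (d) can be checked exhaustively for W = S_3:

    # Sketch only, reusing mat_mul, double_coset, hecke_mul, I3, S1, S2, Wmats.
    from itertools import product

    letters = {"s1": S1, "s2": S2}

    def evaluate(word):
        # the element s1...sd of W (as a matrix) named by a word in the letters
        m = I3
        for s in word:
            m = mat_mul(m, letters[s])
        return m

    # length function d(w): minimum length of a word in S evaluating to w
    length = {}
    for d in range(0, 4):
        for word in product(letters, repeat=d):
            length.setdefault(evaluate(word), d)

    # Lemmas 1 and 2: [s].[w] = [sw] if d(sw) >= d(w), else [s].[w] = [w] + [sw]
    for s in (S1, S2):
        for w in Wmats:
            sw = mat_mul(s, w)
            expected = ({double_coset(sw)} if length[sw] >= length[w]
                        else {double_coset(w), double_coset(sw)})
            assert hecke_mul(s, w) == expected

    def hecke_word(word):
        # the product [s1].(...).[sd], extended to unions of classes via representatives
        out = {double_coset(I3)}
        for s in word:
            out = {c for C in out for c in hecke_mul(next(iter(C)), letters[s])}
        return out

    # Theorem (d): a minimal word multiplies out to the single class [w]
    for d in range(0, 4):
        for word in product(letters, repeat=d):
            m = evaluate(word)
            if length[m] == d:                      # the word is minimal
                assert hecke_word(word) == {double_coset(m)}
    print("Lemmas 1 and 2 and the reduced-word theorem check out")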
To prove that (W, S) is a Coxeter system, we verify a somewhat technical necessary and sufficient condition called the "folding condition" (which I don't understand yet, but which intuitively has to do with "folding" a Coxeter complex along a wall, i.e. onto the half-space on the side of the wall which contains the "preferred" chamber stabilized by B):

Folding Condition: Given w in W and s, t in S such that d(sw) = d(w) + 1 = d(wt), either d(swt) = d(w) + 2 or swt = w.

Verification: Suppose d(swt) < d(w) + 2. Then by Lemmas 1 and 2 (and their right-handed analogues, obtained via the inversion anti-automorphism as before),

    [s].[w].[t] = [s].[wt] = [wt] + [swt]
    [s].[w].[t] = [sw].[t] = [sw] + [swt].

Using the injectivity of [ ]: 2[W] --> 2[B\G/B], the summands in each of these equations are disjoint, and it follows that [wt] = [sw], whence (again by injectivity) wt = sw. Therefore swt = w. QED

Having proven that (W, S) is a Coxeter system, we may give a presentation of the Hecke algebra (over any Q_(+)-algebra), at least in the case where G is finite. Here the minimal polynomial of the orientation [s], for s in S, is

    [s]^2 = (1/q)((q - 1)[s] + [1])

where q (which may depend on s) is the cardinality of the Bruhat cell which is s-adjacent to the one-point identity cell B in the "flag manifold" G/B. Aside from these, the remaining equations of the presentation are of the form

    [s].[t].[s]... = [t].[s].[t]...

where sts... and tst... are alternating words of length m(s, t) in letters s, t in S, and where m(s, t) is the order of st in W. (To apply the theorem above, we need to know these alternating words are minimal; this follows from Tits's solution of the word problem for the presentation.) I am not quite sure what modifications might be needed when G is not finite, particularly in the analogues of the quadratic equations for [s].

(f) Finally we return to Proposition 1, on the characterization of parabolic subgroups.

Lemma 3: If s1...sd is a minimal word which evaluates to w, then the subgroup <w, B> of G generated by w and B (meaning by any representative of w in N, which is immaterial since T is contained in B) contains the elements s1, ..., sd. Therefore (Proposition 1) every subgroup intermediate between B and G is of the form Sum_(w' in <S'>) [w'] = Union_(w' in <S'>) Bw'B for some subset S' of S.

Proof: By induction on d. The case d = 0 is trivial. For d > 0, s2...sd is a minimal word for s1 w, so by inductive hypothesis {s2, ..., sd} is contained in <s1 w, B>; also d(s1 w) < d(w), so Lemma 2 applies:

    [s1].[w] = [w] + [s1 w],

so [s1].[w] /\ [w] != 0. In other words, B s1 B w B and BwB have a common element. It follows that s1 belongs to <w, B>; and, as noted above, {s2, ..., sd} is contained in <s1 w, B>, which is contained in <s1, w, B> = <w, B>. This completes the inductive step.

For the second statement of the lemma, let P be a subgroup between B and G. P is a union of double cosets, i.e. there is a uniquely determined subset (in fact a subgroup) W_P of W such that P = Union_(w in W_P) BwB. Let S' be the set of s in S which occur in a minimal word of some w in W_P. The first statement shows S' is contained in P (each such s lies in some <w, B>, which is contained in P), so by (a) the union of the Bw'B over w' in <S'> is contained in P; on the other hand each w in W_P is a product of elements of S', so W_P is contained in <S'>. Therefore W_P = <S'> and P = Union_(w' in <S'>) Bw'B. QED

I think I'll stop here for now. I hope this hasn't made your eyes glaze over too much!

Todd
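P.S. One last sanity check in the same toy GL(3, F_2) sketch (still only an illustration, reusing the ad hoc helpers hecke_mul, double_coset, mat_mul, S1, S2 from before): the braid relation of the presentation with m(s1, s2) = 3 can be multiplied out explicitly, and both sides collapse to the single class of the longest element.

    # Sketch only: braid relation in the Boolean Hecke algebra for GL(3, F_2).
    def hecke_mul_classes(X, Y):
        # extend the Boolean product to unions of double cosets, via representatives
        # (well-defined: changing a representative translates xBy by elements of B,
        # which does not change the set of double cosets it meets)
        return {c for Cx in X for Cy in Y
                  for c in hecke_mul(next(iter(Cx)), next(iter(Cy)))}

    lhs = hecke_mul_classes(hecke_mul(S1, S2), {double_coset(S1)})   # [s1].[s2].[s1]
    rhs = hecke_mul_classes(hecke_mul(S2, S1), {double_coset(S2)})   # [s2].[s1].[s2]
    w0  = mat_mul(S1, mat_mul(S2, S1))                               # longest element
    assert lhs == rhs == {double_coset(w0)}
    print("braid relation [s1][s2][s1] = [s2][s1][s2] = [w0] checks out")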