Cambridge University Press, April 2003. ISBN: 0521817226
Jacket blurb:
In this ambitious study, David Corfield attacks the widely held view that it is the nature of mathematical knowledge which has shaped the way in which mathematics is treated philosophically, and claims that contingent factors have brought us to the present thematically limited discipline. Illustrating his discussion with a wealth of examples, he sets out a variety of approaches to new thinking about the philosophy of mathematics, ranging from an exploration of whether computers producing mathematical proofs or conjectures are doing real mathematics, to the use of analogy, the prospects for a Bayesian confirmation theory, the notion of a mathematical research programme, and the ways in which new concepts are justified. His highly original book challenges both philosophers and mathematicians to develop the broadest and richest philosophical resources for work in their disciplines, and points clearly to the ways in which this can be done.
Download a 20-page sample of the Introduction (or from the USA).
Read a mathematical physicist's opinion of the book. Other reviews.
(Links give further information relevant to the chapters, including new URLs of ones already out of date in the book.)
Part I: Human and Artificial Mathematicians
Part II: Mathematical Uncertainty
Part III: The Growth of Mathematics
Part IV: The Interpretation of Mathematics
The sense of 'realism' I'm driving at in this chapter is close to that of Alain Connes in A View of Mathematics:
The scientific life of mathematicians can be pictured as a trip inside the geography of the "mathematical reality" which they unveil gradually in their own private mental frame.
It often begins by an act of rebellion with respect to the existing dogmatic description of that reality that one will find in existing books. The young "to be mathematician" realize in their own mind that their perception of the mathematical world captures some features which do not fit with the existing dogma. This first act is often due in most cases to ignorance but it allows one to free oneself from the reverence to authority by relying on one's intuition provided it is backed by actual proofs. Once mathematicians get to really know, in an original and "personal" manner, a small part of the mathematical world, as esoteric as it can look at first, their trip can really start. It is of course vital not to break the "fil d'arianne" [Ariadne's thread] which allows to constantly keep a fresh eye on whatever one will encounter along the way, and also to go back to the source if one feels lost at times...
It is also vital to always keep moving. The risk otherwise is to confine oneself in a relatively small area of extreme technical specialization, thus shrinking one's perception of the mathematical world and its bewildering diversity.
The really fundamental point in that respect is that while so many mathematicians have been spending their entire life exploring that world they all agree on its contours and on its connexity: whatever the origin of one's itinerary, one day or another if one walks long enough, one is bound to reach a well known town i.e. for instance to meet elliptic functions, modular forms, zeta functions. "All roads lead to Rome" and the mathematical world is "connected".
In other words there is just "one" mathematical world, whose exploration is the task of all mathematicians and they are all in the same boat somehow.
Throughout this chapter I use Hopf algebras as a running example of a concept understood to be mathematically important. The appendix gives a definition of a Hopf algebra, but this of course gives very little insight into its nature. For that you can read the introduction to Shahn Majid's Foundations of Quantum Group Theory. A recent paper outlines the extent of the attention paid to these entities - over 6,000 Hopf-algebra-related papers from the past ten years are registered in the ZMATH database.
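For readers who want the definition close to hand, here it is in compressed standard notation (a summary of textbook material, not a substitute for the book's appendix): a Hopf algebra over a field $k$ is a vector space $H$ carrying an associative multiplication $m : H \otimes H \to H$ with unit $\eta : k \to H$, a coassociative comultiplication $\Delta : H \to H \otimes H$ with counit $\varepsilon : H \to k$ (both algebra homomorphisms), and an antipode $S : H \to H$ satisfying

\[ m \circ (S \otimes \mathrm{id}) \circ \Delta \;=\; \eta \circ \varepsilon \;=\; m \circ (\mathrm{id} \otimes S) \circ \Delta. \]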
I mention on p.22 the idea of Vladimir Arnol'd that behind the connections found between different branches of mathematics there lie the more systematic relations of complexification, quaternionization, symplectization, and contactization acting on large parts of mathematics. He describes this as 'Polymathematics'. (See also another account of these ideas in Lecture 2 of the Toronto Lectures on this page.)
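To give the flavour of these operations (my illustration, in the spirit of Arnol'd's lectures rather than a quotation from them): complexification replaces real structures by complex ones, and quaternionization complex ones by quaternionic ones, carrying whole theories along in parallel, e.g.

\[ \mathbb{R} \to \mathbb{C} \to \mathbb{H}, \qquad O(n) \to U(n) \to Sp(n), \qquad \mathbb{RP}^n \to \mathbb{CP}^n \to \mathbb{HP}^n, \]

so that a result about real projective spaces may have non-obvious complex and quaternionic counterparts.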
Part I: Human and Artificial Mathematicians
One of the lessons of this part - that human mathematicians are tuned to the significance of pieces of mathematics, where computers cannot be - is a case of the general point made by Charles Taylor as follows:
"For the crucial difference between men and machines is not consciousness, but rather the significance feature. We also enjoy consciousness, because we are capable of focussing on the significance things have for us, and above all of transforming them through formulation in language. That is not unimportant; but the crucial thing that divides us from machines is what also separates our lesser cousins the dumb brutes from them, that things have significance for us non-relatively. This is the context in which alone something like consciousness is possible for us, since we achieve it by focussing on the significance of things, principally in language, and this is something we do." (Collected Papers I: 201).
In this chapter I consider what we can learn from the successes achieved by automated theorem provers, and from the difficulties faced by their designers and users. One important distinction to make concerning work on these devices is between attempts to mimic human approaches to proving (e.g., via proof-planning) and attempts to play to computers' strength in string matching. A database of problems for theorem provers goes by the name TPTP (Thousands of Problems for Theorem Provers).
One of the most notable successes to date is due to the theorem prover EQP, which has assisted in the proof of the conjecture that a set of axioms suggested by Robbins in the 1930s has the same consequences as the axioms for Boolean algebras. Unlike computer-assisted proofs such as that of the 4-colour theorem, this one runs to only a dozen lines, allowing humans the opportunity to extract some meaning from it. Louis Kauffman does this by reformulating EQP's proof using a box notation derived from the work of Charles Peirce and George Spencer-Brown. Kauffman is a knot theorist with very interesting views on, amongst other things, the nature of mathematical notation. The most important lesson from this chapter is that there is far more to proof than establishing the correctness of a result. Mathematicians want to gain understanding of a domain from its theorems and proofs. Even from the unpromising syntactical trace of EQP's output Kauffman claims to be able to understand "EQP's proof with an enjoyment that was very much the same as the enjoyment that I get from a proof produced by a human being".
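For the record, the Robbins axioms, with $\lor$ read as disjunction and $n$ as negation, are

\[ x \lor y = y \lor x, \qquad (x \lor y) \lor z = x \lor (y \lor z), \qquad n(n(x \lor y) \lor n(x \lor n(y))) = x, \]

and the conjecture EQP settled is that every algebra satisfying them is Boolean.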
In this chapter I discuss what we may learn from attempts to produce mathematical conjectures automatically.
The paper by James Propp discussed in section 3.2, 'Enumeration of Matchings: Problems and Progress', can be found here.
The paper by Claus Krattenthaler mentioned on p.63 is here.
His page on the formula guessing program RATE mentioned in footnote 2 is now here.
The paper by Bailey, Borwein and Plouffe discussed in section 3.3 is P123 from here.
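For reference, the formula at the centre of that paper, now known as the Bailey-Borwein-Plouffe (BBP) formula, is

\[ \pi \;=\; \sum_{k=0}^{\infty} \frac{1}{16^k} \left( \frac{4}{8k+1} - \frac{2}{8k+4} - \frac{1}{8k+5} - \frac{1}{8k+6} \right), \]

whose base-16 form allows an individual hexadecimal digit of $\pi$ to be computed without computing any of the digits before it.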
For more information on Inductive Logic Programming and scientific discovery see here. You can see a table of homotopy groups of the spheres discussed in section 3.5.
The elaborated schema I mention on p.73 appears on p. 8 of this paper by Ronnie Brown and Tim Porter. The paper is also relevant to section 9.7 and chapter 10.
Michael Atiyah has described mathematics as the "science of analogy". How hard is it to force the constructions from one branch to work effectively in another branch? In this chapter I consider the intricacies of the analogy originated by Dedekind and Weber between number fields and function fields. This was explained by the mathematician André Weil to his sister, the philosopher Simone Weil, in a letter written from Rouen prison in 1940. Martin Krieger has translated this letter and has included it as an appendix to his fascinating Doing Mathematics: convention, subject, calculation, analogy (World Scientific Press, 2003). (My review of his book has appeared in Philosophia Mathematica, 13: 106-111.) He also discusses the trilingual analogy (cf. pp. 218-225).
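In crude outline, the dictionary at the heart of the analogy pairs (my summary of the standard presentation, not Weil's own wording):

\[ \mathbb{Z} \leftrightarrow \mathbb{F}_q[t], \qquad \mathbb{Q} \leftrightarrow \mathbb{F}_q(t), \qquad \text{primes } p \leftrightarrow \text{irreducible polynomials}, \]

with number fields (finite extensions of $\mathbb{Q}$) corresponding to function fields (finite extensions of $\mathbb{F}_q(t)$), and the Riemann zeta function to the zeta function of a curve over a finite field, for which Weil himself proved the analogue of the Riemann Hypothesis.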
On p. 98 I refer to Weil likening the mathematician's work to that of the sculptor working on a hard piece of rock whose structure dictates the emerging shape. Readers consulting Krieger's translation may thus be confused to find nothing along these lines, but instead Weil describing the experience of formulating axioms for uniform spaces as follows: "When I invented (I say invented, and not discovered) uniform spaces, I did not have the impression of working with resistant material, but rather the impression that a professional sculptor must have when he plays with a snowman." (Krieger 2003: 304). This forms the perfect contrast to the passage I was alluding to, which occurs in an extract from a letter written a few weeks earlier and tacked onto the end of the first letter as a single entry in Weil's Collected Works. Here he recalls Michelangelo's thought that a block of marble already contains the sculpted work and that the sculptor's task is to remove the excess stone. This is precisely what I am driving at with the idea of 'inherent-structurism'. We see, then, two ends of the range discussed in chapter 9 between the convenient and the natural/essential. For Weil, uniform spaces are a convenient way of unifying a range of mathematics and so an invention; elaborating the trilingual analogy, on the other hand, is a process of discovery. Notice then that such different assessments are made by the same person, i.e., we are not dealing with a blanket constructivism or realism.
Why does a partially worked out analogy give mathematicians greater confidence in the Riemann Hypothesis than do 1.5 billion verifications?
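One schematic way to put the puzzle in the Bayesian terms of this part of the book (a gloss, not a reconstruction of any particular mathematician's reasoning): by Bayes' theorem the posterior odds on a hypothesis $H$ given evidence $E$ are

\[ \frac{P(H \mid E)}{P(\lnot H \mid E)} \;=\; \frac{P(E \mid H)}{P(E \mid \lnot H)} \cdot \frac{P(H)}{P(\lnot H)}, \]

so evidence shifts confidence only to the extent that the likelihood ratio differs from 1. After a billion zeros on the critical line, one more verified zero is nearly as probable on the live rivals to the Riemann Hypothesis as on the hypothesis itself, while a deep structural analogy with results already proved in the function field setting would be far harder to account for if the hypothesis were false.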
Edwin Jaynes' book Probability Theory: The logic of science has now been published by Cambridge University Press.
I make the point in this chapter that a mathematician found to be frequently wrong about their deductive claims will be seen as unreliable, whereas there is no such blame attached if their guesses turn out to be wrong. One might say, however, that the difference comes down to commitment. Whereas you may announce in print that you have a proof of a proposition, assessments of plausibility are frequently not made public, but rather are held privately or expressed anonymously in referees' reports. But there are situations, such as weather forecasting, where responsibility is taken for broadcast probability assessments. Bayesians have a way of assessing the track record of people making such assessments. For instance, they call a weather forecaster's predictions well-calibrated if it rains on roughly 10% of the occasions on which they predict a 10% chance of rain, 20% of the occasions on which they predict a 20% chance of rain, etc. Mathematicians must similarly vary in the accuracy of their hunches about what is correct or likely to work. Choosing a well-calibrated supervisor is rather important for a doctoral student.
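A minimal sketch of how such a calibration table might be computed, in Python (the function name and data format are my own, purely for illustration):

    from collections import defaultdict

    def calibration_table(forecasts):
        """forecasts: iterable of (predicted_probability, occurred) pairs,
        where occurred is True if the forecast event happened.
        Returns the empirical frequency of the event for each
        distinct predicted probability."""
        buckets = defaultdict(list)
        for prob, occurred in forecasts:
            buckets[prob].append(occurred)
        return {prob: sum(outcomes) / len(outcomes)
                for prob, outcomes in sorted(buckets.items())}

    # A forecaster who said "10% chance of rain" ten times, with rain once:
    table = calibration_table([(0.1, True)] + [(0.1, False)] * 9)
    # table == {0.1: 0.1}, i.e. well-calibrated on this bucket.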
I've recently discovered an article written by James Franklin, a mathematician at the University of New South Wales, entitled "Non-deductive Logic in Mathematics", British Journal for the Philosophy of Science 38: 1-18 (available here). This deals similarly with George Polya's idea that mathematicians spend much of their time reasoning non-deductively, assessing the likelihood that a result is true, or that they or their students have the capability of proving something true. What is very surprising is that none of the people I consulted about my own attempts on this score (chaps. 5 and 6 of my book) seemed to be aware of it.
Taking van Fraassen's assertion seriously,
"To accept a theory is to make a commitment, a commitment to the further confrontation of new phenomena within the framework of the theory, a commitment to a research programme, and a wager that all relevant phenomena can be accounted for without giving up that theory." (The Scientific Image: 88),
it follows that the willingness to make such a wager may depend on your confidence in as yet unestablished mathematical claims.
Timothy Chow (MIT) pointed out to me a great example of the effect of old evidence: the confidence gained by Edward Witten and other string theorists in their theory through its accounting for gravity. "Each of the five string theories predicts gravity (plus quantum mechanics): that is, these theories predict a structure that looks just like general relativity at long distances, with corrections (unfortunately unmeasurably small in practice) proportional to α′. This is very striking, since, as I have stressed, standard quantum field theory makes gravity impossible. It is the single most important reason for the intensive study of string theory in the last generation." (Magic, Mystery, and Matrix)
In this chapter I show how Lakatos misses a large part of the story about the way mathematicians proceed: concept-stretching can still take place after axiomatisation.
For more on the Jaffe-Quinn debate see Michael Stoeltzner's (2000) 'What Lakatos Could Teach the Mathematical Physicist'.
How might Lakatos have carried his notion of a scientific research programme over to mathematics?
In section 8.6 I make a comparison between my ideas and the means-end analysis of mathematics proposed by Penelope Maddy in her Naturalism in Mathematics. Given her attention to the process of doing real mathematics, it is very fitting that Maddy's book has won the 2002 Lakatos Award.
This chapter is an exploration of the ways in which mathematicians argue about the relative importance of what they study. In an earlier age I might have extended the title by adjoining 'or The Conceptualisation of Mathematical Importance'. I study the case of the groupoid notion. To some it is just an artificial amalgamation of the notions of equivalence relation and group. To others it is a convenient concept in a range of situations where groups are not quite up to the job. To a third group, it is a 'natural' concept, which allows richer forms of symmetry to be detected and which should be studied in its own right. Debates surrounding the validity of examples of this promotion process are rich sources for the philosopher of real mathematics.
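For readers wanting the bare definition (standard material, not specific to any of the three camps): a groupoid is a small category in which every morphism is invertible. Concretely, it has objects $G_0$ and morphisms $G_1$ with source and target maps $s, t : G_1 \to G_0$, composition defined when source matches target, an identity $\mathrm{id}_x$ for each object $x$, and for each morphism $g$ an inverse with

\[ g^{-1} g = \mathrm{id}_{s(g)}, \qquad g\, g^{-1} = \mathrm{id}_{t(g)}. \]

A group is then exactly a groupoid with one object, and an equivalence relation on a set $X$ is exactly a groupoid with object set $X$ having at most one morphism between any two objects; this is what gives the 'amalgamation' charge its bite.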
At stake is a challenge to the notion that to be mathematically worthwhile a new concept must allow the solution of old problems. An example of someone making this challenge occurs when Gian-Carlo Rota, discussing the reception of Grassmann's exterior algebra, notes that it was met by the standard question:
"What can you prove with exterior algebra that you cannot prove without it?" Whenever you hear this question raised about some new piece of mathematics, be assured that you are likely to be in the presence of something important. In my time, I have heard it repeated for random variables, Laurent Schwartz' theory of distributions, ideles and Grothendieck's schemes, to mention only a few. A proper retort might be: "You are right. There is nothing in yesterday's mathematics that could not also be proved without it. Exterior algebra is not meant to prove old facts, it is meant to disclose a new world. Disclosing new worlds is as worthwhile a mathematical enterprise as proving old conjectures." (Indiscrete Thoughts, 48)
For more on the material in section 9.7 see Ronnie Brown's 'Towards non commutative algebraic topology'.
Physicists are currently debating the value of groupoids for their discipline. Look for the thread 'Symmetry groups versus symmetry groupoids' in recent months of the sci.physics.research newsgroup archive.
Here I discuss four reasons why philosophers should investigate higher-dimensional algebra (or higher-dimensional category theory):
(a) Many important constructions may profitably be seen as the categorification of familiar constructions (see the sketch after this list).
(b) It provides a way of organising a considerable proportion of mathematics.
(c) These constructions have applications in mathematics, computer science, and physics.
(d) Higher-dimensional algebra blurs the distinction between topology and algebra: pieces of algebraic notation are taken as dimensioned topological entities inhabiting a space.
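To illustrate (a), here is a standard example in the spirit of Baez's expository papers (my choice, not a passage from the book): categorification replaces sets by categories, functions by functors, and equations by natural isomorphisms. The natural numbers under addition and multiplication categorify to the category of finite sets under disjoint union and cartesian product, since

\[ |X \sqcup Y| = |X| + |Y|, \qquad |X \times Y| = |X| \cdot |Y|, \]

so taking cardinalities ('decategorification') recovers ordinary arithmetic, while the categorified level remembers which bijections witness the equations.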
While category theorists have earned themselves the reputation of being somewhat fanatical, those I have met to date have been thoroughly urbane types. John Baez has written acres of insightful exposition about mathematics and physics, including the role of n-categories in topological quantum field theory. Tom Leinster has just made his book Higher Operads, Higher Categories, which has a very accessible introduction, available on the web. Ronnie Brown offers a useful account of higher-dimensional group theory.
Diagrammatic notation seems to be springing up all over the place; see the birdtracks approach to group-theoretic calculations and papers classified as quantum algebra. To see examples of the kinds of diagrams I discuss on page 256 of my book, take a look at this postscript file.
Please e-mail me with any comments you have on the book.
Back to my home page.
See my blog.