Information Geometry
John Baez
July 26, 2021
Information geometry is the study of 'statistical manifolds', which are
spaces where each point is a hypothesis about some state of affairs.
This subject, usually considered a branch of statistics, has important
applications to machine learning and somewhat unexpected connections
to evolutionary biology. To learn this subject, I'm writing a series
of articles on it. You can navigate forwards and back through these
using the blue arrows. And by clicking the links that say "on
Azimuth", you can see blog entries containing these articles. Those
let you read comments about my articles—and also make comments
or ask questions of your own!
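Before diving in, here is a minimal sketch of the central object, in notation of my own choosing rather than anything fixed in the posts: a statistical manifold is a family of probability distributions parametrized by the quantities we are uncertain about, and the Fisher information metric measures how distinguishable two nearby hypotheses are.

\[
p(x \mid \theta), \qquad \theta = (\theta^1, \dots, \theta^n),
\]
\[
g_{ij}(\theta) = \int p(x \mid \theta)\,
\frac{\partial \ln p(x \mid \theta)}{\partial \theta^i}\,
\frac{\partial \ln p(x \mid \theta)}{\partial \theta^j}\, dx .
\]

Parts 1 and 2 below explain where this formula comes from and how it connects to statistical mechanics.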
Eric Auld has created a PDF of some of these posts and some other blog articles of mine: Information Geometry.
- Part 1 - the Fisher information metric from statistical mechanics.
- Part 2 - connecting the statistical mechanics approach to the usual definition of the Fisher information metric.
- Part 3 - the Fisher information metric on any manifold equipped with a map to the mixed states of some system.
- Part 4 - the Fisher information metric as the real part of a complex-valued quantity whose imaginary part measures quantum uncertainty.
- Part 5 - an example: the harmonic oscillator in a heat bath.
- Part 6 - relative entropy.
- Part 7 - the Fisher information metric as the matrix of second derivatives of relative entropy (see the formulas sketched after this list).
- Part 8 - information geometry and evolution: how natural selection resembles Bayesian inference, and how it's related to relative entropy.
- Part 9 - information geometry and evolution: the replicator equation and the decline of entropy as a successful species takes over.
- Part 10 - information geometry and evolution: how entropy changes under the replicator equation.
- Part 11 - information geometry and evolution: the decline of relative information.
- Part 12 - information geometry and evolution: an introduction to evolutionary game theory.
- Part 13 - information geometry and evolution: the decline of relative information as a population approaches an evolutionarily stable state.
- Part 14 - open Markov processes and the principle of minimum dissipation. (Joint with Blake Pollard.)
- Part 15 - how relative entropy changes in open Markov processes. (Joint with Blake Pollard.)
- Part 16 - an updated version of Fisher's fundamental theorem of natural selection, linking the replicator equation and the Fisher information metric.
- Part 17 - symplectic and contact geometry in thermodynamics.
- Part 18 - symplectic and contact geometry in probability theory.
- Part 19 - surprisal as analogous to momentum, and an overview of the analogy between classical mechanics, thermodynamics and probability theory.
- Part 20 - developing the connection between thermodynamics and probability theory using statistical manifolds.
- Part 21 - the basic properties of the Gibbs distribution, which maximizes entropy subject to constraints on the expected values of a list of random variables (also sketched after this list).
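For quick reference, here are the two formulas alluded to in Parts 7 and 21, again in my own shorthand rather than the notation the posts build up: the Fisher information metric arises as the matrix of second derivatives of relative entropy, and the Gibbs distribution maximizes entropy subject to constraints on the expected values of random variables \(X_a\).

\[
D(p_\theta \,\|\, p_{\theta'}) = \int p(x \mid \theta)\, \ln \frac{p(x \mid \theta)}{p(x \mid \theta')}\, dx,
\qquad
g_{ij}(\theta) = \left. \frac{\partial^2}{\partial \theta'^i \, \partial \theta'^j}\, D(p_\theta \,\|\, p_{\theta'}) \right|_{\theta' = \theta},
\]
\[
p(x) = \frac{e^{-\sum_a \lambda_a X_a(x)}}{Z(\lambda)},
\qquad
Z(\lambda) = \sum_x e^{-\sum_a \lambda_a X_a(x)},
\]
where the Lagrange multipliers \(\lambda_a\) are chosen so that each \(X_a\) has the prescribed expected value.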
The following papers are spinoffs of the above series of blog articles. You can also read blog articles summarizing these papers:
- Blake Pollard, A Second Law for open Markov processes, Open Systems and Information Dynamics 23 (2016), 1650006. (Blog article here.)
- John Baez, Brendan Fong and Blake Pollard, A compositional framework for Markov processes, Jour. Math. Phys. 57 (2016), 033301. (Blog article here.)
- John Baez and Blake Pollard, Relative entropy in biological systems, Entropy 18 (2016), 46. (Blog article here.)
- Blake Pollard, Open Markov processes: A compositional perspective on non-equilibrium steady states in biology, Entropy 18 (2016), 140. (Blog article here.)
- John Baez, The fundamental theorem of natural selection.
I also have some talks connected to this work:
- Diversity, entropy and thermodynamics, Exploratory Conference on the Mathematics of Biodiversity, Centre de Recerca Matemàtica, July 5, 2012.
- Information and entropy in biological systems, NIMBioS Investigative Workshop: Information and Entropy, National Institute for Mathematical and Biological Synthesis, April 4, 2015.
- Biodiversity, entropy and thermodynamics, Biological and Bio-Inspired Information Theory, Banff International Research Station, October 29, 2014.
- Biology as information dynamics, Biological Complexity: Can it be Quantified?, Beyond Center, February 2, 2017.
You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.
- John von Neumann, giving advice to Claude Shannon on what to name his discovery.
© 2016 John Baez
baez@math.removethis.ucr.andthis.edu