Information Geometry

John Baez

July 26, 2021

Information geometry is the study of 'statistical manifolds': spaces where each point is a hypothesis about some state of affairs. This subject, usually considered a branch of statistics, has important applications to machine learning and somewhat unexpected connections to evolutionary biology. I'm writing a series of articles to learn it. You can navigate forwards and back through them using the blue arrows, and by clicking the links that say "on Azimuth" you can see the blog entries containing these articles. There you can read comments about my articles, and also make comments or ask questions of your own!
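
To give a quick taste of what 'statistical manifold' means here (a standard definition in the subject, not a quote from the articles themselves): a smooth family of probability distributions $p(x \mid \theta)$, parametrized by coordinates $\theta = (\theta^1, \dots, \theta^n)$, carries a natural Riemannian metric, the Fisher information metric:

\[
g_{ij}(\theta) = \int \frac{\partial \ln p(x \mid \theta)}{\partial \theta^i} \, \frac{\partial \ln p(x \mid \theta)}{\partial \theta^j} \; p(x \mid \theta) \, dx
\]

Roughly speaking, the distance between two nearby points in the manifold measures how easily the corresponding hypotheses can be distinguished by data.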

Eric Auld has created a PDF of some of these posts and some other blog articles of mine: Information Geometry

The following papers are spinoffs of the above series of blog articles. You can also read blog articles summarizing these papers. I also have some talks connected to this work.


You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage. - John von Neumann, giving advice to Claude Shannon on what to name his discovery.

© 2016 John Baez
baez@math.removethis.ucr.andthis.edu
