## Network Theory

Nature and the world of human technology are full of networks. People like to draw diagrams of networks: flow charts, electrical circuit diagrams, signal-flow graphs, Bayesian networks, Feynman diagrams and the like. Mathematically minded people know that in principle these diagrams fit into a common framework: category theory. But we are still far from a unified theory of networks. After an overview, we look at three portions of the jigsaw puzzle in three separate talks:

You can see the slides for all these talks here, and videos too.

Here are some kinds of networks I discuss:

### Overview

You can also see the slides of this talk. Click on any picture in the slides, or any text in blue, and get more information!

To read more about the network theory project, go here:

### I. Electrical circuits and signal-flow graphs

You can also see the slides of this talk. Click on any picture in the slides, or any text in blue, and get more information!

For more details on signal-flow graphs and control theory, see:

For more on circuit diagrams, see:

### II. Stochastic Petri nets, chemical reaction networks and Feynman diagrams

You can also see the slides of this talk. Click on any picture in the slides, or any text in blue, and get more information!

For more details, try this free book:

as well as this paper on the Anderson–Craciun–Kurtz theorem (discussed in my talk):
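For readers who want the statement of the theorem without opening the paper, here is a rough summary (my paraphrase, not a quote): if the rate equation of a chemical reaction network with mass-action kinetics has a complex balanced equilibrium $c = (c_1, \dots, c_k)$, then the corresponding master equation has a stationary distribution that is a product of independent Poisson distributions with means $c_i$:

```latex
\pi(n_1, \dots, n_k) \;=\; \prod_{i=1}^{k} e^{-c_i}\, \frac{c_i^{\,n_i}}{n_i!}
```

In the Fock-space language used in the talk, this says that the coherent state with amplitude $c$, suitably normalized, is an equilibrium solution of the master equation.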

### III. Entropy, information and Bayesian networks

You can also see the slides of this talk. Click on any picture in the slides, or any text in blue, and get more information!

For more details on entropy and information, try:

For something a bit lighter, try these blog articles:

• A Characterization of Entropy: how entropy can be characterized as the unique functor (up to a constant factor) with a few nice properties.
• Relative Entropy (Part 1): how various structures important in probability theory arise naturally when you do linear algebra using only the nonnegative real numbers.
• Relative Entropy (Part 2): a category related to statistical inference, $\mathrm{FinStat}$, and how relative entropy defines a functor on this category.
• Relative Entropy (Part 3): how relative entropy can be characterized as the unique functor (up to a constant factor) with a few nice properties.
• Relative Entropy: a summary of the previous three posts, with much more emphasis on category-theoretic nuances.
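Throughout these posts, 'relative entropy' means the usual relative entropy of finite probability distributions, also known as the Kullback–Leibler divergence (this is the standard definition, stated here for convenience): given distributions $p$ and $q$ on the same finite set,

```latex
S(p \,\|\, q) \;=\; \sum_i p_i \ln\!\frac{p_i}{q_i}
```

with the convention $0 \ln 0 = 0$, and with $S(p \,\|\, q) = +\infty$ whenever $p_i > 0$ but $q_i = 0$. The fact that it can jump to infinity in this way is the source of the failure of continuity mentioned below.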

In my talk I mistakenly said that relative entropy is a continuous functor; in fact it's just lower semicontinuous. I have fixed my slides.

The third part of my talk was my own interpretation of Brendan Fong's master's thesis:

I take a slightly different approach, saying that a causal theory $\mathcal{C}_G$ is the free category with products on certain objects and morphisms coming from a directed acyclic graph $G$. In his thesis he merely said $\mathcal{C}_G$ was the free symmetric monoidal category where each generating object is equipped with a cocommutative comonoid structure. This is close to a category with finite products, but not quite the same: a symmetric monoidal category where every object is equipped with a cocommutative comonoid structure in a natural way (i.e., making a bunch of squares commute) is a category with finite products. It would be interesting to see whether this difference hurts or helps.

By making this slight change, I am claiming that causal theories can be seen as algebraic theories in the sense of Lawvere, which would be a very good thing, since we know a lot about those.
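To make the construction concrete, here is a small example (my notation, following the conventions of Fong's thesis as I understand them). For the directed acyclic graph $G$ with vertices $a, b, c$ and edges $a \to c$ and $b \to c$, the causal theory $\mathcal{C}_G$ is the category with finite products freely generated by objects $X_a, X_b, X_c$ and one morphism per vertex, going from the product of that vertex's parents:

```latex
f_a : 1 \to X_a, \qquad f_b : 1 \to X_b, \qquad f_c : X_a \times X_b \to X_c
```

Here the empty product is the terminal object $1$. A model of this theory in a suitable category of stochastic maps assigns to each $f_v$ a (conditional) probability distribution, recovering the usual data of a Bayesian network on $G$.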

© 2014 John Baez (except for images)
baez@math.removethis.ucr.andthis.edu