You can see the slides for all these talks here, and videos too.
Here are some kinds of networks I discuss:
You can also see the slides of this talk. Click on any picture in the slides, or any text in blue, and get more information!
To read more about the network theory project, go here:
You can also see the slides of this talk. Click on any picture in the slides, or any text in blue, and get more information!
For more details on signal-flow graphs and control theory, see:
You can also see the slides of this talk. Click on any picture in the slides, or any text in blue, and get more information!
For more details, try this free book:
You can also see the slides of this talk. Click on any picture in the slides, or any text in blue, and get more information!
For more details on entropy and information, try:
For something a bit lighter, try these blog articles:
In my talk I mistakenly said that relative entropy is a continuous functor; in fact it's just lower semicontinuous. I have fixed my slides.
The third part of my talk was my own interpretation of Brendan Fong's master's thesis:
I take a slightly different approach: I define the causal theory $\mathcal{C}_G$ of a directed acyclic graph $G$ to be the free category with products on certain objects and morphisms coming from $G$. In his thesis, Fong instead defined $\mathcal{C}_G$ to be the free symmetric monoidal category in which each generating object is equipped with a cocommutative comonoid structure. This is close to a category with finite products, but not quite the same: a symmetric monoidal category in which every object is equipped with a cocommutative comonoid structure in a natural way (i.e., making a bunch of squares commute) is a category with finite products. It would be interesting to see whether this difference hurts or helps.

By making this slight change, I am claiming that causal theories can be seen as algebraic theories in the sense of Lawvere, and this would be a very good thing, since we know a lot about those.
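To make the construction concrete, here is a small sketch in LaTeX. The DAG and the names $f_a, f_b, f_c$ are my own toy illustration, not taken from the thesis; the pattern, though, is the general one: each vertex contributes a generating morphism whose domain is the product of its parents.

```latex
% Toy example: the causal theory of the DAG  a --> c <-- b.
% The objects of C_G are generated under finite products by a, b, c,
% and there is one generating morphism per vertex, going from the
% product of its parents (the empty product 1 for parentless vertices)
% to the vertex itself:
\[
  f_a \colon 1 \to a, \qquad
  f_b \colon 1 \to b, \qquad
  f_c \colon a \times b \to c .
\]
% C_G is the free category with finite products on these generators.
% A product-preserving functor out of C_G into a category of
% probabilistic maps then picks out distributions for a and b and a
% conditional distribution for c given a and b -- a Bayesian network
% shaped like G.
```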