I'm not claiming my results are new—indeed I have no idea whether they are, and I'd like to hear from any experts who might know. I'm just claiming that this is some work I did last weekend.
People sometimes worry that if they explain their ideas before publishing them, someone will 'steal' them. But I think this overestimates the value of ideas, at least in esoteric fields like mathematical physics. The problem is not people stealing your ideas: the hard part is giving them away. And let's face it, people in love with math and physics will do research unless you actively stop them. I'm reminded of this scene from the Marx Brothers movie where Harpo and Chico, playing wandering musicians, walk into a hotel and offer to play:
Groucho: What do you fellows get an hour?
Chico: Oh, for playing we getta ten dollars an hour.
Groucho: I see...What do you get for not playing?
Chico: Twelve dollars an hour.
Groucho: Well, clip me off a piece of that.
Chico: Now, for rehearsing we make special rate. Thatsa fifteen dollars an hour.
Groucho: That's for rehearsing?
Chico: Thatsa for rehearsing.
Groucho: And what do you get for not rehearsing?
Chico: You couldn't afford it.
So, I'm just rehearsing in public here—but of course I hope to write a paper about this stuff someday, once I get enough material.
Remember where we were. We had considered a manifold—let's finally give it a name, say
where
All this applies to both classical and quantum mechanics. Crooks wrote down a beautiful formula for this metric in the classical case. But since I'm at the Centre for Quantum Technologies, not the Centre for Classical Technologies, I redid his calculation in the quantum case. The big difference is that in quantum mechanics, observables don't commute! But in the calculations I did, that didn't seem to matter much—mainly because I took a lot of traces, which imposes a kind of commutativity:

$$ \mathrm{tr}(AB) \;=\; \mathrm{tr}(BA) $$
In fact, if I'd wanted to show off, I could have done the classical and quantum cases simultaneously by replacing all operators by elements of any von Neumann algebra equipped with a trace. Don't worry about this much: it's just a general formalism for treating classical and quantum mechanics on an equal footing. One example is the algebra of bounded operators on a Hilbert space, with the usual concept of trace. Then we're doing quantum mechanics as usual. But another example is the algebra of suitably nice functions on a suitably nice space, where taking the trace of a function means integrating it. And then we're doing classical mechanics!
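To make this concrete, here is the dictionary I have in mind, in notation I'm choosing just for illustration: the trace on the algebra of operators is the usual sum of diagonal matrix entries, while the 'trace' on an algebra of functions is integration against a measure:

$$ \mathrm{tr}(a) \;=\; \sum_n \langle e_n | \, a \, | e_n \rangle \qquad \text{(quantum: $a$ a trace-class operator, $\{e_n\}$ an orthonormal basis)} $$

$$ \mathrm{tr}(f) \;=\; \int_\Omega f(x) \, d\mu(x) \qquad \text{(classical: $f$ a suitably nice function on a measure space $(\Omega,\mu)$)} $$

Every trace formula below can be read either way.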
For example, I showed you how to derive a beautiful formula for the metric I wrote down a minute ago:
But if we want to do the classical version, we can say Hey, presto! and write it down like this:
What did I do just now? I changed the trace to an integral over some space
In what follows, I'll keep talking about the quantum case, but in the back of my mind I'll be using von Neumann algebras, so everything will apply to the classical case too.
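Schematically, the kind of translation I mean looks like this, writing $\rho$ for the density matrix in the quantum case and $p$ for the probability distribution on $\Omega$ in the classical case (and glossing over the fact that in the quantum case one may want to take a real part, since operators need not commute):

$$ g_{ij} \;=\; \mathrm{tr}\!\left( \rho \, \partial_i \ln \rho \, \partial_j \ln \rho \right) \qquad \text{(quantum)} $$

$$ g_{ij} \;=\; \int_\Omega p(x) \, \partial_i \ln p(x) \, \partial_j \ln p(x) \, dx \qquad \text{(classical)} $$

Here $\partial_i$ means the partial derivative with respect to the $i$th coordinate on our manifold of states, and the classical version is the Fisher information metric in its most familiar form.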
So what am I going to do? I'm going to fix a big problem with the story I've told so far.
Here's the problem: so far we've only studied a special case of the Fisher information metric. We've been assuming our states are Gibbs states, parametrized by the expectation values of some observables
But people like to work a lot more generally. We could look at any smooth function
in this more general situation. Nobody can stop us! But it would be better if we could derive this formula, as before, starting from a formula like the one we had before:
The challenge is that now we don't have observables
Well, you may remember that last time we had
where
was the partition function. Let's copy this idea.
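Spelled out, with notational conventions I'm fixing here (and up to choices of sign), the Gibbs states from last time looked like this: exponentiate a linear combination of the observables $X_1, \dots, X_n$ and divide by whatever it takes to normalize the result:

$$ \rho(\lambda) \;=\; \frac{e^{-\lambda^1 X_1 - \cdots - \lambda^n X_n}}{Z(\lambda)}, \qquad Z(\lambda) \;=\; \mathrm{tr}\!\left( e^{-\lambda^1 X_1 - \cdots - \lambda^n X_n} \right) $$

The numbers $\lambda^1, \dots, \lambda^n$ serve as coordinates on our manifold of states, and the observables $X_i$ are handed to us from the start.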
So, we'll start with our density matrix
where
(Note that
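In other words, and here I'm supplying my own symbols, writing $\tilde\rho$ for the unnormalized state and $\lambda = (\lambda^1, \dots, \lambda^n)$ for local coordinates on our manifold, the plan is to write our smoothly varying density matrix as a positive-operator-valued function divided by its trace:

$$ \rho(\lambda) \;=\; \frac{\tilde\rho(\lambda)}{Z(\lambda)}, \qquad Z(\lambda) \;=\; \mathrm{tr}\big( \tilde\rho(\lambda) \big) $$

Dividing by $Z$ guarantees $\mathrm{tr}\,\rho(\lambda) = 1$, just as dividing by the partition function did before.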
Now we can repeat some calculations I did last time. As before, let's take the logarithm of
and then differentiate it. Suppose
Last time we had nice formulas for both terms on the right-hand side above. To get similar formulas now, let's define operators
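In the notation above, the definition I have in mind is

$$ X_i \;:=\; -\,\partial_i \ln \tilde\rho $$

with the sign chosen so that in the Gibbs-state case, where $\ln \tilde\rho = -\lambda^1 X_1 - \cdots - \lambda^n X_n$, we get back exactly the observables we started with. And since $\ln \rho = \ln \tilde\rho - \ln Z$, where $\ln Z$ is just a number, differentiating gives the two terms in question:

$$ \partial_i \ln \rho \;=\; \partial_i \ln \tilde\rho \;-\; \partial_i \ln Z \;=\; -X_i \;-\; \partial_i \ln Z $$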
This gives a nice name to the first term on the right-hand side above. What about the second term? We can calculate it out:
where in the last step we use the chain rule. Next, use the definition of
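Still in my notation, the calculation of the second term goes like this:

$$ \partial_i \ln Z \;=\; \frac{\partial_i Z}{Z} \;=\; \frac{\mathrm{tr}(\partial_i \tilde\rho)}{Z} \;=\; \frac{\mathrm{tr}(\tilde\rho \, \partial_i \ln \tilde\rho)}{Z} \;=\; -\,\mathrm{tr}(\rho \, X_i) \;=\; -\,\langle X_i \rangle $$

The third equals sign is where taking a trace quietly rescues us from noncommutativity: $\mathrm{tr}(\partial_i \tilde\rho) = \mathrm{tr}(\tilde\rho \, \partial_i \ln \tilde\rho)$ holds even when $\tilde\rho$ and its derivative fail to commute, thanks to the cyclic property of the trace.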
This is just what we got last time! Ain't it fun to calculate when it all works out so nicely?
So, putting both terms together, we see
or better:
This is a nice formula for the 'fluctuation' of the observables
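In the notation above, the result reads:

$$ \partial_i \ln \rho \;=\; \langle X_i \rangle \;-\; X_i, \qquad \text{or better,} \qquad X_i \;-\; \langle X_i \rangle \;=\; -\,\partial_i \ln \rho $$

Note that the right-hand side involves only $\rho$ itself, not the auxiliary $\tilde\rho$ we chose along the way. That is reassuring, since different choices of $\tilde\rho$ had better not affect the geometry we are about to build.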
From here on out, it's easy. As before, we can define
Using the formula
we get
or
Voilà!
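In formulas, and with the caveat that I'm throwing in a real part so the answer is a real symmetric matrix even when the $X_i$ fail to commute, the definition and the resulting expression are:

$$ g_{ij} \;=\; \mathrm{Re}\,\Big\langle \big(X_i - \langle X_i \rangle\big)\big(X_j - \langle X_j \rangle\big) \Big\rangle \;=\; \mathrm{Re}\;\mathrm{tr}\!\Big( \rho \; \partial_i \ln \rho \; \partial_j \ln \rho \Big) $$

The second equality is just the fluctuation formula above used twice, together with $\langle A \rangle = \mathrm{tr}(\rho A)$.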
When this matrix is positive definite at every point, we get a Riemannian metric on
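If you like checking such things numerically, here is a small script (a toy example of my own, using finite differences) that verifies the classical version of the story on a finite set: the covariance of the fluctuations $X_i - \langle X_i \rangle$ agrees with the Fisher information metric $\sum_x p \,\partial_i \ln p \,\partial_j \ln p$:

```python
import numpy as np

# Toy check (classical case): for a family of probability distributions
# p(x; lam) on a finite set, the Fisher metric
#     g_ij = sum_x p * d_i(ln p) * d_j(ln p)
# should equal the covariance of the fluctuations X_i - <X_i>,
# where X_i = -d_i ln(p~) and p~ is an unnormalized version of p.

rng = np.random.default_rng(0)
n_states, n_params = 5, 2
A = rng.normal(size=(n_params, n_states))    # fixed "observables" A_i(x)
lam = rng.normal(size=n_params)              # a point in parameter space

def unnormalized(l):                         # p~(x) = exp(-lam . A(x))
    return np.exp(-l @ A)

def prob(l):                                 # p = p~ / Z
    w = unnormalized(l)
    return w / w.sum()

def dlog(f, i, eps=1e-6):                    # central difference of ln f along lam^i
    e = np.zeros(n_params); e[i] = eps
    return (np.log(f(lam + e)) - np.log(f(lam - e))) / (2 * eps)

p = prob(lam)
g_fisher = np.array([[np.sum(p * dlog(prob, i) * dlog(prob, j))
                      for j in range(n_params)] for i in range(n_params)])

X = np.array([-dlog(unnormalized, i) for i in range(n_params)])   # X_i(x)
fluct = X - (X @ p)[:, None]                                      # X_i - <X_i>
g_cov = (fluct * p) @ fluct.T                                     # covariance of fluctuations

print(np.allclose(g_fisher, g_cov, atol=1e-7))                    # prints True
```

Here the unnormalized family happens to be of Gibbs form, but nothing in the check depends on that; any smooth positive unnormalized family would do.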
Differential geometers like to use
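For instance, with $\lambda^1, \dots, \lambda^n$ as local coordinates and the summation convention in force, the whole matrix of components gets packaged into a single tensor:

$$ g \;=\; g_{ij} \, d\lambda^i \otimes d\lambda^j $$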
Differential geometers like coordinate-free formulas, so let's also give a coordinate-free formula for our metric. Suppose
Here
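In the notation I've been using, the coordinate-free formula presumably comes out like this: for tangent vectors $v$ and $w$ at a point of our manifold,

$$ g(v, w) \;=\; \mathrm{Re}\;\mathrm{tr}\!\Big( \rho \, (v \ln \rho)(w \ln \rho) \Big) $$

where $v \ln \rho$ means the derivative of the operator-valued function $\ln \rho$ along $v$. Feeding in the coordinate vector fields $v = \partial_i$ and $w = \partial_j$ recovers the components $g_{ij}$ above.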
So, this is all very nice. To conclude, two more points: a technical one, and a more important philosophical one.
First, the technical point. When I said
We can't really take the logarithm of every density matrix. Remember, we take the log of a density matrix by taking the log of all its eigenvalues. These eigenvalues are ≥ 0, but if one of them is zero, we're in trouble! The logarithm of zero is undefined.
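For example, a pure state already causes trouble:

$$ \rho \;=\; \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} $$

has $0$ as an eigenvalue, so $\ln \rho$ would need $\ln 0$ as an eigenvalue, and there is no such thing.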
On the other hand, there's no problem taking the logarithm of our density-matrix-valued function
So, we must assume
Second, the philosophical point. Instead of starting with the density matrix
where we cleverly divide by the normalization factor
to get
So we have added a little extra information when switching from
where
is a smooth function. This doesn't change
But it doesn't change their 'fluctuations'
so it doesn't change the metric
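In the notation above, here is the calculation behind that claim. Rescale the unnormalized state by a positive function of the parameters, say $\tilde\rho \mapsto e^{-f(\lambda)} \tilde\rho$ with $f$ smooth (my notation for this gauge transformation). Then

$$ Z \mapsto e^{-f} Z, \qquad \rho \mapsto \rho, \qquad X_i \mapsto X_i + \partial_i f, \qquad \langle X_i \rangle \mapsto \langle X_i \rangle + \partial_i f $$

so the fluctuations $X_i - \langle X_i \rangle$, and with them the metric $g_{ij}$, are left untouched.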
This gauge freedom is interesting, and I want to understand it better. It's related to something very simple yet mysterious. In statistical mechanics the partition function
This is just like the split personality of phases in quantum mechanics. On the one hand they 'don't matter': you can multiply a unit vector by any phase and the pure state it defines doesn't change. But on the other hand, changes in phase matter a lot.
Indeed the analogy here is quite deep: it's the analogy between probabilities in statistical mechanics and amplitudes in quantum mechanics, the analogy between
You can read a discussion of this article on Azimuth, and make your own comments or ask questions there!