People like to say this is the "information age" - but just how much information are we talking about? I recently found a nice source of... umm... information about this question:
You have to be careful interpreting these figures. For example, all the information in a year's worth of the Washington Post newspaper is far less than the information capacity of all the newsprint put out by the Washington Post during this year - because there are thousands of identical copies of each newspaper. I have tried to distinguish between these two by saying "capacity" for the latter sort of figure. For example, when I say 130 petabytes is the capacity of all audio tapes produced in 2002, this includes thousands of identical copies of some album by the Backstreet Boys, and also lots of blank tape. And, it's quite possible that I screwed up on some of the items above, because my source makes it a bit hard to tell what's what.
Furthermore, these figures don't count the fact that information is typically not compressed down to the Shannon limit. So, there's not as much info around as this chart suggests. For some figures that try to take compression into account, see How Much Information? 2003.
For example, this compression issue is especially important in my guess at the information in the human genome, and the genomes of all the people in the world. I didn't try to take into account the immense overlap in genetic information between different people, nor the repetitive stretches in human DNA. Here's how I did the calculation. Each of us has chromosomes with about 5 billion base pairs. Each base pair holds 2 bits of information: A, T, C, or G. That's 10 billion bits, or 1.25 gigabytes. Times the roughly 6.5 billion people in the world now, we get about 8 x 10^{18} bytes, or 8 exabytes. They only built 2 exabytes of hard disks in 2002. But, if we wanted to store the complete genetic identity of everyone on hard drives, we could easily do it, using data compression, because a lot of genes are the same from person to person.
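The arithmetic above is easy to check in a few lines of code. This is just a sketch of the back-of-envelope calculation, using the rough figures from the text (5 billion base pairs, 6.5 billion people), not real genomics:

```python
# Rough genome arithmetic, using the figures quoted in the text above.
base_pairs = 5e9            # base pairs per person (rough figure from the text)
bits_per_base_pair = 2      # A, T, C, or G: 4 choices = 2 bits
people = 6.5e9              # world population, circa 2005

bits_per_person = base_pairs * bits_per_base_pair    # 10 billion bits
bytes_per_person = bits_per_person / 8               # 1.25 gigabytes
world_total_bytes = bytes_per_person * people        # about 8 x 10^{18} bytes

print(f"per person: {bytes_per_person / 1e9:.2f} GB")
print(f"everyone:   {world_total_bytes / 1e18:.1f} exabytes")
```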
And of course, some of these figures are rough order-of-magnitude guesses, like "all the words ever spoken by human beings".
It would be fun to know how much information is in people's brains, but I don't have the knowhow to estimate that!
I just read from an unreliable source that the human eye has a resolution equivalent to that of 127 million pixels (whatever that means), and an input rate of 1 gigabit per second. This source also said that the human brain can store 10 terabytes of information. But, I don't trust these figures at all without knowing how they were calculated.
I read somewhere else that the human brain has 10^{14} synapses. If each stored just one bit (not true), that would be about 10 terabytes.
By the way, the information it takes to completely describe a single raindrop vastly exceeds all the figures above. This sort of information is called entropy, and there's a lot of it around.
How much? Just for fun, let's compute the information it takes to completely describe everything about a raindrop: its precise quantum state.
This sounds hard, but it's not if we let ourselves look up a few numbers. First of all, the entropy of water! At room temperature (25 degrees Celsius) and normal pressure (1 atmosphere), the entropy of a mole of water is 69.91 joules per kelvin.
To understand this, first you need to know that chemists like "moles" - and by a mole, I don't mean that fuzzy creature that ruins your lawn: I mean a certain ridiculously large number of molecules or atoms, invented to deal with the fact that even a tiny little thing is made of lots of atoms. By definition, a mole is about the number of atoms in one gram of hydrogen.
This number is about 6.022 x 10^{23}, and people call it Avogadro's number, in honor of the chemist Avogadro. So, a mole of water means 6.022 x 10^{23} molecules of water. And since a water molecule is 18 times heavier than a hydrogen atom, this is 18 grams of water.
So, if we prefer grams to moles, the entropy of a gram of water is 69.91/18 = 3.88 joules per kelvin.
What does that have to do with information? Well, Boltzmann, Shannon and others figured out how entropy and information are related, and the formula is pretty simple: one nit of information equals 1.3807 x 10^{-23} joules per kelvin of entropy. This number is called Boltzmann's constant.
What's a "nit" of information? Well, bits of information are a good unit when you're using binary notation - 0's and 1's - but trits would be a good unit if you were using base 3, and so on. For physics the most natural unit is a "nit", where we use base e. So, the "n" in "nit" stands for "natural".
Don't get in a snit over the fact that we can't actually write numbers using base e - if you do, I'll just say you're nitpicking! The point is, information in the physical world is not digital, it's analog - so base e turns out to be the best.
Okay: so, by taking the reciprocal of Boltzmann's constant we see that one joule per kelvin of entropy equals 7.24 x 10^{22} nits of information. By the way, I don't want to explain why entropy is measured in joules per kelvin - that's another fun story.
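In code, this conversion is a single division - a quick sanity check using the value of Boltzmann's constant quoted above:

```python
k_B = 1.3807e-23   # Boltzmann's constant: joules per kelvin per nit

nits_per_joule_per_kelvin = 1 / k_B
print(f"1 J/K = {nits_per_joule_per_kelvin:.3g} nits")   # about 7.24 x 10^{22} nits
```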
That's all we need to look up. We can now just multiply and see that a gram of water (at room temperature and pressure) holds
3.88 x 7.24 x 10^{22} = 2.81 x 10^{23} nits
of information. In other words, this is how much information it takes to completely specify the state of one gram of water.
Or if you prefer bits, use the fact that a bit equals ln(2) or .693 nits. Dividing by this, we see a gram of water holds
4.05 x 10^{23} bits
of information.
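The whole chain of arithmetic - molar entropy, entropy per gram, nits, bits - fits in a few lines. Here's a sketch using the figures quoted above:

```python
import math

# Information content of a gram of liquid water at room temperature and pressure.
S_molar = 69.91      # entropy of a mole of water, in J/K (25 C, 1 atm)
molar_mass = 18.0    # grams of water per mole
k_B = 1.3807e-23     # Boltzmann's constant: J/K per nit

S_per_gram = S_molar / molar_mass   # about 3.88 J/K per gram
nits = S_per_gram / k_B             # about 2.81 x 10^{23} nits
bits = nits / math.log(2)           # 1 bit = ln(2) nits

print(f"one gram of water: {nits:.3g} nits = {bits:.3g} bits")
```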
Now what about a raindrop? Well, I guess it depends how big a raindrop is! If I type "size of raindrop" into the all-knowing Google, I get a webpage helpfully titled Mass of a Raindrop by Glenn Elert, which shows masses ranging from .004 to 100 milligrams. Wow!
Oh well, I will assume a 1-milligram raindrop, just because it's simple. This holds
4.05 x 10^{20} bits
of information. Dividing by 8, we get
5 x 10^{19} bytes
of information. (The whole idea of bytes seems a bit silly to me anyway - but especially when applied to a raindrop.) Or if you prefer:
50 exabytes!
This is about 10 times the information in every word ever spoken by human beings throughout the course of history. We would need to talk for much longer to completely describe a raindrop.
It's also 25 times the capacity of all the hard disks produced in 2002.
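Since Elert's table gives raindrop masses spanning five orders of magnitude, it's easy to see how the answer scales with the size of the drop. A sketch, using the rough per-gram figure computed above:

```python
# Rough bytes-per-gram figure from the gram-of-water calculation above.
bytes_per_gram = 5e22

# Elert's range of raindrop masses, in milligrams.
for mass_mg in (0.004, 1, 100):
    mass_g = mass_mg / 1000
    print(f"{mass_mg:>7} mg raindrop: about {bytes_per_gram * mass_g:.0e} bytes")
```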
One thing all this goes to show is that the sheer amount of information has little to do with how useful or interesting it is. Another thing, though, is that all our magnificent methods of information storage come nowhere close to handling the information packed in even the simplest things of nature.
We might start with something smaller than a raindrop - say, a human red blood cell. How much information would it take to completely describe one of these?
I can't look up the entropy of red blood cells, but while hemoglobin and lipids are a lot more interesting than an equivalent mass of water, their entropy is probably not vastly different: this is just how chemistry works. So, if we just want an order-of-magnitude estimate, we can do an easier problem: compute the information in a red-blood-cell sized droplet of water.
A red blood cell is about 10^{-3} centimeters across, so its volume is roughly 10^{-9} cubic centimeters. So, let's consider an equivalent volume of water. This has a mass of 10^{-9} grams - a nanogram - since a cubic centimeter of water weighs (almost exactly) a gram.

We computed the information in a milligram of water just a minute ago in our raindrop problem. So, we just need to divide by a million to get the information content of a nanogram of water. Easy! The answer is

50 terabytes!

We saw a while back that about 440 petabytes of email were sent in 2002. The information required to completely describe a red blood cell is only about a ten-thousandth of that - but remember that dubious claim that the human brain can store 10 terabytes? Even if it's true, it would take about 5 brains' worth of storage just to completely describe a single red blood cell.
© 2005 John Baez
baez@math.removethis.ucr.andthis.edu