
## Information

#### April 14, 2020

People like to say this is the "information age". But just how much information are we talking about? I recently found a nice source of... umm... information about this question:

They got some of their figures from here. Here's a little chart of data from these and other sources:

• A bit is the information in one binary decision: a no or yes, a 0 or 1.

• 5 bits: approximate information in one letter of the Roman alphabet.
• 7.11 bits: information in a molecule of water in the form of ice.
• 12.14 bits: information in a molecule of liquid water at standard temperature and pressure.
• 32.76 bits: information in a molecule of water vapor at standard temperature and pressure.

• A byte is 8 bits.

• A kilobyte is a thousand bytes.

• 2 kilobytes: a typewritten page.
• 100 kilobytes: a low-resolution photograph.

• A megabyte is a million bytes.

• 1 megabyte: a small novel or a 3.5 inch floppy disk.
• 2 megabytes: a high-resolution photograph.
• 5 megabytes: the complete works of Shakespeare.
• 100 megabytes: one meter of shelved books.
• 500 megabytes: a CD-ROM.

• A gigabyte is a billion bytes, or 2³⁰ bytes.

• 1.25 gigabytes: the human genome, or a pickup truck full of books.
• 4.78 gigabytes: a 1-gram black hole.
• 20 gigabytes: a good collection of the works of Beethoven.
• 100 gigabytes: a library floor of academic journals.

• A terabyte is a trillion bytes.

• 2 terabytes: an academic research library.
• 6 terabytes: all academic journals printed in 2002.
• 10 terabytes: the print collections of the U.S. Library of Congress.
• 40 terabytes: all books printed in 2002.
• 50 terabytes: all mass market periodicals printed in 2002.
• 60 terabytes: all audio CDs released in 2002.
• 80 terabytes: capacity of all floppy discs produced in 2002.
• 140 terabytes: all newspapers printed in 2002.
• 170 terabytes: the searchable portion of the World-Wide Web in 2002.
• 250 terabytes: capacity of all zip drives produced in 2002.

• A petabyte is 10¹⁵ bytes.

• 1.5 petabytes: all office documents generated in 2002.
• 2 petabytes: all U.S. academic research libraries.
• 6 petabytes: all cinema release films in 2002.
• 20 petabytes: all X-ray photographs taken in 2002.
• 50 petabytes: information required to completely describe a single human red blood cell, down to every subatomic particle.
• 90 petabytes: the "Deep Web" in 2002 — includes private databases, controlled-access sites and so on.
• 130 petabytes: capacity of all audio tapes produced in 2002.
• 400 petabytes: all photographs taken in 2002.
• 440 petabytes: all emails sent in 2002.

• An exabyte is 10¹⁸ bytes.

• 1.3 exabytes: capacity of all videotapes produced in 2002.
• 2 exabytes: capacity of all hard disks produced in 2002.
• 5 exabytes: all the words ever spoken by human beings.
• 9.75 exabytes: information in the genomes of all the people in the world in 2020.
• 330 exabytes: total capacity of all hard drives sold by Seagate Technology in 2011.

• A zettabyte is 10²¹ bytes.

• A yottabyte is 10²⁴ bytes.

• 1 yottabyte: a million city-block-sized data centers full of terabyte hard drives, built with the technology available in 2010.
• 50 yottabytes: the information needed to completely describe a liter of water at room temperature.

• 1.364 × 10⁶⁶ bits: the amount of information in a black hole whose mass equals that of the Earth.

• 1.381 × 10⁶⁹ bits: the amount of information in a black hole whose event horizon is 1 square meter in area.

• 1.514 × 10⁷⁷ bits: the information needed to completely describe a black hole the mass of the Sun.

• 4 × 10⁹⁹ bits: the information needed to completely describe the black hole in the galaxy Holmberg 15A, which has a mass roughly 170 billion times that of our Sun.

• 5 × 10¹⁰⁴ bits: the entropy of the observable universe as roughly estimated by Egan and Lineweaver.

• 10¹²⁴ bits: the most information we could fit into the observable Universe.

### Warnings

You have to be careful interpreting these figures.

First, while some are precise, others are rough order-of-magnitude guesses, like "all the words ever spoken by human beings".

Second, the information in a year's worth of the New York Times is far less than the information capacity of all the newsprint put out by the New York Times during that year — because there are thousands of identical copies of each newspaper. I have tried to distinguish between these two by saying "capacity" for the latter sort of figure. For example, when I say 130 petabytes is the capacity of all audio tapes produced in 2002, this includes thousands of identical copies of some album by the Backstreet Boys, and also lots of blank tape. It's quite possible that I screwed up on some of the items above, because my source makes it a bit hard to tell what's what.

A third problem is that not everyone agrees on the definition of 'kilobyte', 'megabyte', 'gigabyte', and so on! Originally people used 'kilobyte' to mean not 1000 but 1024 bytes, since 1024 = 2¹⁰, and powers of 2 are nicer when you're using binary. The difference is just 2.4%. But by the time you get to 'yottabytes' it makes a bigger difference: 2⁸⁰ is about 20.9% more than 10²⁴.

There have, in fact, been lawsuits over hard drives that contained only 10⁹ bytes per gigabyte, instead of 2³⁰. And at some point, defenders of the metric system tried to crack down on the evil practice of using prefixes like 'kilo-', 'mega-', 'giga-' to mean something other than powers of ten. 2¹⁰ bytes is now officially a kibibyte, 2²⁰ bytes is a mebibyte, and so on. But I haven't heard people actually say these words.
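The gap between the binary and decimal prefixes is easy to tabulate. Here's a short Python sketch (the prefix list is just for labeling):

```python
# Compare binary prefixes 2^(10n) with decimal prefixes 10^(3n).
# The discrepancy grows from 2.4% at kilo- to about 21% at yotta-.
prefixes = ["kilo", "mega", "giga", "tera", "peta", "exa", "zetta", "yotta"]
for n, name in enumerate(prefixes, start=1):
    binary, decimal = 2 ** (10 * n), 10 ** (3 * n)
    excess = 100 * (binary / decimal - 1)
    print(f"{name:>5}: 2^{10 * n} is {excess:.1f}% more than 10^{3 * n}")
```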

Finally, the figures above don't count the fact that information is typically not compressed as much as possible. For example, there are almost 32 = 2⁵ letters in English, which suggests roughly 5 bits per letter. But some letters are used more than others, and taking advantage of this lets us cut the information down to about 4.1 bits per letter. We can compress text further using correlations between letters. Taking correlations within each 8-letter block into account, Shannon estimated in 1948 that there are just 2.3 bits per letter. In 1950 he went further and estimated that taking correlations in 100-letter blocks into account, each letter holds only 1 bit of information.
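The 4.1-bits-per-letter figure is easy to reproduce from single-letter frequencies. This sketch uses an approximate frequency table (the rounded percentages are an assumption, not from the text):

```python
import math

# Shannon entropy of English from approximate single-letter frequencies (%).
freq = {
    'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7,
    's': 6.3, 'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8,
    'u': 2.8, 'm': 2.4, 'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0,
    'p': 1.9, 'b': 1.5, 'v': 1.0, 'k': 0.8, 'j': 0.15, 'x': 0.15,
    'q': 0.10, 'z': 0.07,
}
total = sum(freq.values())
H = -sum(p / total * math.log2(p / total) for p in freq.values())
print(f"about {H:.1f} bits per letter")   # compare log2(26) = 4.70 for equal frequencies
```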

So, there's not as much info around as this chart suggests. For some figures that try to take compression into account, see How Much Information? 2003.

### The human genome

The compression issue is quite important for my guess at the information of all the genomes of all the people in the world. I ignored this in my calculations. Here's what I did, just so you know.

Each of us has chromosomes with a total of about 5 billion base pairs. Each base pair can hold 2 bits of information: A, T, C, or G. That's 10 billion bits, or 1.25 gigabytes.

(Here and in everything that follows, I'll use the decimal system, so a gigabyte will be exactly 10⁹ bytes, and so on.)

Multiplying this by the roughly 7.5 billion people in the world now, we get about 9 × 10¹⁸ bytes, or 9 exabytes. They only built 2 exabytes of hard disks in 2002. But, if we wanted to store the complete genetic identity of everyone on hard drives, we could easily do it, using data compression, because a lot of genes are the same from person to person.
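The arithmetic behind these genome figures, as a quick sketch:

```python
# Back-of-envelope genome storage, as in the text: 2 bits per base pair.
base_pairs = 5e9                  # per person, all chromosomes
bits_per_genome = 2 * base_pairs  # A, T, C or G = 2 bits each
bytes_per_genome = bits_per_genome / 8
print(bytes_per_genome / 1e9, "gigabytes per genome")   # 1.25

people = 7.5e9
total_bytes = bytes_per_genome * people
print(total_bytes / 1e18, "exabytes for everyone")      # 9.375
```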

Furthermore, we can compress any one particular genome! For starters, each triplet of base pairs in the genetic code selects one of 21 possibilities: 20 amino acids or a 'stop' signal. That reduces the information to at most ln(21)/ln(2) ≈ 4.4 bits per triplet, or about 1.5 bits per base pair.

There are also long patches of highly repetitive DNA, such as long terminal repeats. On top of that, at least 20% of human DNA consists of numerous copies of parasitic genes that do nothing but make copies of themselves, like long interspersed nuclear elements! So you could compress the human genome a lot more if you wanted.

### The eye, the brain

I once read from an unreliable source that the human eye has a resolution equivalent to that of 127 million pixels (whatever that means), and an input rate of 1 gigabit per second. This source also said that the human brain can store 10 terabytes of information. But, I don't trust these figures at all without knowing how they were calculated.

I read somewhere else that the human brain has 10¹⁴ synapses. If each stored just one bit — which is far from true — that would be about 10 terabytes.

### A gram of water

But now let's talk about some serious big data. It would take 50,000,000 petabytes of data to completely describe one gram of water, down to the positions and velocities of the individual subatomic particles...

... limited, of course, by the Heisenberg uncertainty principle! That's what makes the amount of information finite.

How can we calculate this? It sounds hard, but it's not if you look up a few numbers.

First of all, the entropy of water! At room temperature (25 degrees Celsius) and normal atmospheric pressure, the entropy of a mole of water is 69.95 joules per kelvin.

To understand this, first you need to know that chemists like moles. By a mole, I don't mean that fuzzy creature that ruins your lawn. I mean a certain ridiculously large number of molecules or atoms, invented to deal with the fact that even a tiny little thing is made of lots of atoms. A mole is very close to the number of atoms in one gram of hydrogen.

A guy named Avogadro figured out that this number is about 6.022 × 10²³. People now call this number the Avogadro constant. So, a mole of water is 6.022 × 10²³ molecules of water. And since a water molecule is 18 times heavier than a hydrogen atom, this is 18 grams of water.

So, if we prefer grams to moles, the entropy of a gram of water is 69.95/18 = 3.89 joules per kelvin. By the way, I don't want to explain why entropy is measured in joules per kelvin — that's another fun story.

But what does all this have to do with information? Well, Boltzmann, Shannon and others figured out how entropy and information are related. The formula is pretty simple: one nat of information equals 1.3808 × 10⁻²³ joules per kelvin of entropy. This number is called Boltzmann's constant.

What's a 'nat' of information? Well, bits of information are a good unit when you're using binary notation — 0's and 1's — but trits would be a good unit if you were using base 3, and so on. For physics, the most natural unit is a nat, where we use base e. So, 'nat' stands for 'natural'. A bit equals ln(2) nats, or approximately .6931 nats.

Don't get in a snit over the fact that we can't actually write numbers using base e — if you do, I'll just say you're nitpicking, or natpicking! The point is, information in the physical world is not binary — so base e turns out to be the best.

Okay: so, by taking the reciprocal of Boltzmann's constant we see that one joule per kelvin of entropy equals 7.24 × 10²² nats of information.

That's all we need to look up. We can now just multiply and see that a gram of water (at room temperature and pressure) holds $$3.89 \times 7.24 \times 10^{22} \; = \; 2.82 \times 10^{23}$$

nats of information. In other words, this is how much information it takes to completely specify the state of one gram of water!

Or if you prefer bits, use the fact that a bit is .6931 nats. Dividing by this, we see a gram of water holds 4.06 × 10²³ bits. Dividing by 8, that's about 5 × 10²² bytes, or 50 zettabytes, or 50,000,000 petabytes.
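The whole chain — joules per kelvin, to nats, to bits, to bytes — fits in a few lines of Python. Here k_B is the CODATA value of Boltzmann's constant; everything else is from the text:

```python
import math

# Information content of one gram of liquid water at 25 degrees Celsius.
k_B = 1.380649e-23        # Boltzmann's constant: joules per kelvin, per nat
S_molar = 69.95           # molar entropy of liquid water, J/K
S_gram = S_molar / 18     # 18 grams per mole

nats = S_gram / k_B       # entropy in nats
bits = nats / math.log(2)
print(f"{nats:.3g} nats = {bits:.3g} bits = {bits / 8:.3g} bytes")
```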

One thing this goes to show is that the sheer amount  of information has little to do with how useful or interesting it is. It also shows that all our magnificent methods of information storage come nowhere close to handling the information packed in even the simplest things of nature.

Of course, the problem is that there are a lot of water molecules in a gram of water. As I mentioned, a mole of water has 6.022 × 10²³ molecules, and its entropy is 69.95 joules per kelvin; to convert that to bits we divide by Boltzmann's constant and ln(2). So, the number of bits of information it takes to describe one molecule of liquid water — its position, momentum, angular momentum, its precise shape and so on — is $$\frac{69.95}{6.022 \times 10^{23} \times 1.3808 \times 10^{-23} \times 0.6931} \; = \; 12.14$$

bits. That's a more manageable number! By the way, for ice the figure is 7.11 bits per molecule, while for water vapor it's 32.76 bits per molecule.
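The per-molecule figures come from the same conversion, divided by the Avogadro constant. The molar entropies for ice and water vapor below are standard-table values, assumed here rather than quoted from the text:

```python
import math

N_A = 6.02214e23          # Avogadro constant, molecules per mole
k_B = 1.380649e-23        # Boltzmann's constant, J/K per nat

def bits_per_molecule(molar_entropy):
    """Convert a molar entropy in J/K to bits per molecule."""
    return molar_entropy / (N_A * k_B * math.log(2))

print(bits_per_molecule(41.0))    # ice: about 7.11 bits
print(bits_per_molecule(69.95))   # liquid water: about 12.14 bits
print(bits_per_molecule(188.8))   # water vapor: about 32.76 bits
```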

Here's what I find most cool, though: 50 zettabytes capture everything about what's happening in the cubic centimeter of space and the gram of liquid water in it — unless there are things going on that have almost no interaction with anything we can observe. The great thing about entropy is that it's comprehensive: the entropy of water did not change when we discovered protons were made of quarks, and it will not change if we discover quarks are made of sub-quarks.

Only with quantum mechanics is this 'backwards-compatibility' of entropy possible. In classical mechanics it doesn't work: the more little parts something has, the more entropy it has. Classically, even the entropy of something as simple as an ideal gas works out to be infinite, since it takes an infinite amount of information to precisely specify the position and velocity of a point particle in a box!

You see, classical mechanics is sick when it comes to thermodynamics. Planck got famous for curing the 'ultraviolet divergence' that makes the energy of a classical box of light infinite in thermal equilibrium. The infinity you get when you calculate the entropy of a classical ideal gas is almost as disturbing, though for some reason less well-known.

### The blood cell

Let's look at something smaller — say, a human red blood cell. How much information would it take to completely describe one of these?

I can't look up the entropy of red blood cells, but while hemoglobin and lipids are a lot more interesting than an equivalent mass of water, their entropy is probably not vastly different: this is just how chemistry works. So, if we just want an order-of-magnitude estimate, we can do an easier problem: compute the information in a red-blood-cell sized droplet of water.

A red blood cell is about 10⁻² centimeters across, so its volume is roughly 10⁻⁶ cubic centimeters. So, let's consider an equivalent volume of water. This has a mass of 10⁻⁶ grams — a microgram — since a cubic centimeter of water weighs (almost exactly) a gram.

We computed the information in a gram of water just a minute ago. So, we just need to divide by a million to get the information content of a microgram of water. Easy! The answer is: 50 petabytes!

About 440 petabytes of email were sent in 2002. So, the emails sent that year carried almost ten times the information needed to completely describe a red blood cell down to the subatomic level.

### The black hole

An amazing fact is that black holes have an entropy proportional to their surface area — that is, the area of their event horizon. So, it takes a definite, finite amount of information to describe the microscopic details of a black hole. A famous calculation by Stephen Hawking says exactly how much: ¼ of a nat of information per 'Planck area'.

The Planck area is a very small unit of area. It's the square of the Planck length, which is about 1.6162 × 10⁻³⁵ meters. So, there's one nat of information for each $$4 \times (1.6162 \times 10^{-35})^2 \; = \; 1.0448 \times 10^{-69}$$ square meters of black hole event horizon. In other words, there is one bit per $$\ln(2) \times 1.0448 \times 10^{-69} = 7.242 \times 10^{-70}$$ square meters of horizon. Or if you prefer, each square meter of a black hole's horizon takes $$\frac{1}{7.242 \times 10^{-70}} \; = \; 1.381 \times 10^{69}$$ bits to describe.
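Numerically, using the CODATA value of the Planck length:

```python
import math

# A quarter of a nat per Planck area, converted to bits per square meter.
l_P = 1.616255e-35               # Planck length in meters
area_per_nat = 4 * l_P ** 2      # horizon area holding one nat
area_per_bit = area_per_nat * math.log(2)
print(f"{1 / area_per_bit:.4g} bits per square meter of horizon")
```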

This amazing fact suggests that information about matter that falls into a black hole is 'stored in the event horizon'. However, that's not something we really know. And Hawking didn't need to know this to do his calculation.

The history of this is very interesting. First, people discovered that black holes obey three laws that are very much like the three laws of thermodynamics. The second of these laws says that the surface area of the event horizons of black holes always increases. That's similar to the second law of thermodynamics, which says that entropy increases!

This made people suspect that black holes have entropy proportional to their surface area. And then, in a wonderful feat of ingenuity, Hawking calculated the constant of proportionality! This is why he's famous — along with the fact that he has a wheelchair and a cool-sounding electronic voice.

Anyway, let's use Hawking's result to calculate the information in a few black holes.

For this, it helps to know that for a nonrotating, uncharged black hole, the area of the event horizon is $$4\pi R^2$$, where $$R$$ is its radius. Moreover, its radius is $$R = 2GM/c^2$$ where $$M$$ is the black hole's mass, $$G$$ is the gravitational constant and $$c$$ is the speed of light. Working this out in metric units, the radius $$R$$ in meters is 1.4851 × 10⁻²⁷ times its mass in kilograms.

So, the area of the event horizon in square meters is $$4 \pi (1.4851 \times 10^{-27} M)^2 \; = \; 2.7717 \times 10^{-53} M^2$$ where $$M$$ is its mass in kilograms. This means that the information required to fully describe this black hole is $$1.381 \times 10^{69} \; \times \; 2.7717 \times 10^{-53} M^2 \; = \; 3.827 \times 10^{16} M^2$$

bits.

So, a 1-gram black hole has an information of 3.827 × 10¹⁰ bits, or about 4.78 gigabytes!
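Rather than reusing the shortcut constant 3.827 × 10¹⁶, we can sketch the whole computation from scratch (the constants are CODATA values):

```python
import math

G = 6.67430e-11          # gravitational constant
c = 2.99792458e8         # speed of light, m/s
l_P = 1.616255e-35       # Planck length, m

def black_hole_bits(M):
    """Bits needed to describe a nonrotating, uncharged black hole of mass M kilograms."""
    R = 2 * G * M / c ** 2                    # Schwarzschild radius
    A = 4 * math.pi * R ** 2                  # horizon area
    return A / (4 * l_P ** 2 * math.log(2))   # 1/4 nat per Planck area, in bits

print(black_hole_bits(1e-3))        # 1 gram: about 3.8e10 bits, i.e. ~4.78 gigabytes
print(black_hole_bits(5.972e24))    # Earth mass: about 1.4e66 bits
print(black_hole_bits(1.989e30))    # Sun mass: about 1.5e77 bits
```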

This is much less than the information in a gram of water. This is why a very small black hole, like a 1-gram black hole, is highly unstable. You could increase the entropy a lot by letting it radiate away and then turning the energy from this radiation into water... and anti-water.

On the other hand, the mass of the Earth is 5.972 × 10²⁴ kilograms, so if the Earth collapsed to form a black hole, the information needed to completely describe that black hole would be $$3.827 \times 10^{16} \; \times \; (5.972 \times 10^{24})^2 \; = \; 1.364 \times 10^{66}$$

bits. The information goes up as the square of the mass, so now the amount of information is huge. But the black hole would still be small: its radius would be just 0.88 centimeters!

What if the Sun collapsed to form a black hole? The Sun's mass is 1.989 × 1030 kilograms, so there are $$3.827 \times 10^{16} \; \times \; (1.989 \times 10^{30})^2 \; = \; 1.514 \times 10^{77}$$

bits of information in a solar-mass black hole.

What about the biggest black hole we know? The biggest ones have masses that aren't very accurately determined, but the one at the center of the supergiant galaxy Holmberg 15A may have a mass 170 billion times that of our Sun. That's about $$1.7 \times 10^{11} \; \times \; 2 \times 10^{30} \; = \; 3.4 \times 10^{41}$$ kilograms. So, there are roughly $$3.827 \times 10^{16} \; \times \; (3.4 \times 10^{41})^2 \; = \; 4.4 \times 10^{99}$$ bits of information in this black hole. That's almost a googol bits!

### The universe

How much information does it take to describe everything in the universe?

We can't answer this, not yet anyway, because we don't even know if the universe is finite or infinite in size.

So, let's talk about the observable universe. The universe may be infinite in size. But when we look back all the way to when the hot gas of the early universe first cooled down and became transparent, everything we see fits in a finite-sized ball centered at us. That ball has by now expanded to be much larger. This larger present-day ball is called the observable universe.

It should really be called the 'once-observed universe', since we can't see what distant galaxies are doing now. But regardless of what it's called, the radius of this ball is about 4.4 × 10²⁶ meters. How much information could we fit in here?

When you keep trying to stuff more information into some region, eventually you get a black hole. As we've seen, the amount of entropy is then proportional to the surface area of the black hole — and this is just 4π times the black hole's radius squared.

At least this is true if the black hole isn't rotating or charged... and it's sitting in otherwise flat space. Luckily, while the universe is expanding, space at any moment seems close to flat.

So let's roughly work out the surface area of the observable universe, by taking 4π times its radius squared. We get about 2.4 × 10⁵⁴ square meters.

We saw in the last section that each square meter of a black hole's event horizon holds 1.381 × 10⁶⁹ bits of information. If the entire observable universe were stuffed with enough matter to turn it into a black hole, it would thus hold $$1.381 \times 10^{69} \; \times \; 2.4 \times 10^{54} \; \approx \; 10^{124}$$

bits. This figure is very rough!
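As a sketch, here's the same bound in Python; the 4.4 × 10²⁶ meter radius is the figure quoted above, and the answer is only good to an order of magnitude:

```python
import math

R = 4.4e26                        # radius of the observable universe, meters
area = 4 * math.pi * R ** 2       # roughly 2.4e54 square meters
bits_per_m2 = 1.381e69            # one bit per 7.242e-70 square meters of horizon
print(f"{area * bits_per_m2:.2g} bits")   # a few times 1e123, i.e. roughly 1e124
```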

Of course, packing this much information into the observable universe would turn it into a black hole! In reality the information is much less. So, 10¹²⁴ is just an upper bound. But it sets a bound on how many possible quantum states the observable universe has. If the entropy of a system is N bits, its number of quantum states is 2ᴺ. So, the observable universe can have at most

$$2^{10^{124}} \; \approx \; 10^{3 \times 10^{123}}$$

quantum states.

This is the biggest number I know that has any physical meaning!

A much harder problem is to estimate the actual entropy of the observable universe. There are a couple of papers on that:

The first paper estimates the entropy of the observable universe at very roughly 10¹⁰² bits. The second estimates it at very roughly 5 × 10¹⁰⁴ bits. The difference is due to an increased estimate of the number of supermassive black holes: that is, black holes of masses at least 10 million times that of the Sun, living at the centers of galaxies. These dominate the entropy of the universe!

It's also fun to see Egan and Lineweaver's other estimates of the big contributors to entropy in the observable universe:

• stars: 10⁸¹ bits.
• interstellar and intergalactic gas and dust: 10⁸² bits.
• gravitons: 10⁸⁸ bits.
• neutrinos: 10⁹⁰ bits.
• photons: 10⁹⁰ bits.
• stellar black holes: 10⁹⁸ bits.
• supermassive black holes: 5 × 10¹⁰⁴ bits.

Too much information running through my brain,
Too much information driving me insane.
- The Police