What if the fundamental “stuff” of the universe isn’t matter or energy, but information?
That’s the idea some theorists are pursuing as they search for ever-more elegant and concise descriptions of the laws that govern our universe. Could our universe, in all its richness and diversity, really be just a bunch of bits?
To understand the buzz over information, we have to start at the beginning: What is information?
These words are information. So is an image, and so is an equation.
“It doesn’t matter whether something consists of equations, words, images or sounds—you can encode any of that in strings of zeroes and ones,” or bits, says Scott Aaronson, associate professor of electrical engineering and computer science at MIT. Your computer is doing it right now, using tiny magnets, capacitors, and transistors to store billions or trillions of binary digits. “These might have been hard concepts for people to understand a century ago, but because of the computer revolution, we deal with these concepts all the time,” says Aaronson. In an age when a USB drive dangles from every keychain and an iPhone strains the seams of every pocket, it isn’t such a leap to agree that anything can be expressed in information.
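For the curious, Aaronson's point is easy to demonstrate. Here is a minimal Python sketch (ours, not his) that flattens a short message into a string of zeroes and ones and then recovers it:

```python
# Any message -- words here, but equally images or sound samples --
# can be flattened into a string of bits and reconstructed from it.
message = "E = mc^2"

# Encode: one byte per character (UTF-8), eight bits per byte.
bits = "".join(format(byte, "08b") for byte in message.encode("utf-8"))

# Decode: regroup the bit string into bytes and read it back as text.
recovered = bytes(
    int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)
).decode("utf-8")
```

After running this, `bits` is a 64-character string containing only `0` and `1`, and `recovered` equals the original message.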
To some theorists, though, information is more than just a description of our universe and the stuff in it: it is the most basic currency of existence, occupying what theorist Paul Davies terms the “ontological basement” of reality.
The rules of quantum information provide the most “compact” description of physics, says Vlatko Vedral, professor of quantum information theory at the University of Oxford and the National University of Singapore. “Information, it seems to me, requires fewer assumptions about anything else we could postulate. As soon as you talk about matter and energy, you have to write down the laws that govern matter and energy.”
Does this mean that our universe is made of information, as some headlines claim?
“It strikes me as a contentless question,” says Aaronson. “To say that matter and energy are important in physics is to say something with content.” You can imagine a universe barren of matter and energy, after all; specifying that our universe is furnished with both tells you something about it and distinguishes it from other possible universes. “But I don’t know how you could even conceive of a universe” without information, he says.
Yet, as a fresh way of thinking about, well, what the universe is about, information has touched off provocative work in computer science and theoretical astrophysics, apparently disparate fields that may share a deep link manifested by that cosmic Rosetta stone, the black hole. But before we dive into the black hole, let’s step back to take a deeper look at information itself.
All messages contain information, but not all messages are created equal. “Unexpected things have high information content,” says Vedral. Take a sunrise, for example. “If the sun rises tomorrow, you won’t see any newspaper writing about it. But of course if it didn’t rise, it would be a major event.”
We sense intuitively that “surprises,” like a missed sunrise, carry more information than routine events. Claude Shannon, widely considered the father of information theory, formalized this intuition by defining a quantity that’s now known as “Shannon entropy.” The Shannon entropy of a message is the sum, over every value the message could take, of the probability of that value times the negative logarithm of that probability; the rarer an outcome, the more it contributes. That’s a mouthful, but, Vedral explains, it mathematically captures two important features of information: the value of surprises, and the fact that information is “additive”—that is, that the total information contained in two, three, four, or a billion unrelated events is equal to the sum of the information in each one.
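Both features are easy to see in a few lines of code. This Python sketch (our illustration, not Shannon's notation) computes entropy in bits and checks the sunrise intuition and additivity:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip is maximally surprising: exactly one bit of information.
fair = shannon_entropy([0.5, 0.5])          # 1.0 bit

# A near-certain sunrise is barely news at all.
sunrise = shannon_entropy([0.999, 0.001])   # about 0.011 bits

# Additivity: two independent fair coins (four equally likely outcomes)
# carry exactly twice the information of one.
two_coins = shannon_entropy([0.25] * 4)     # 2.0 bits
```

The nearly certain event contributes almost nothing, while combining independent events simply adds their entropies, just as Vedral describes.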
Physicists describe entropy a little differently, often speaking in terms of the “disorder” of a system. More precisely, entropy is the number of different ways you can rearrange the littlest parts of a system and still get the same big system. A bucket full of red Legos, for instance, has high entropy. Shake it up, spin it around, and you still have what you began with: a bucket of red Legos. Assemble those same blocks into a Lego castle, though, and you’ve slashed the entropy; moving a single block nets you a different “macroscopic” system.
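The Lego intuition maps onto Boltzmann's formula, S = k ln W, where W counts the microscopic arrangements consistent with the same macroscopic state. A toy Python sketch (our simplification, with the constant k dropped):

```python
import math

def log_microstates(n_blocks):
    """ln(W) for a jumbled bucket: any of the n! orderings of the
    blocks looks like the same bucket of red Legos, so W = n!."""
    return math.lgamma(n_blocks + 1)  # lgamma(n + 1) = ln(n!)

# Shake a bucket of 100 blocks and nothing changes: many microstates,
# high entropy.
bucket = log_microstates(100)

# The castle tolerates exactly one arrangement -- move one block and
# it's a different structure -- so W = 1 and the entropy is zero.
castle = math.log(1)
```

In this toy model the bucket's entropy is large while the castle's is exactly zero, which is just the article's point about slashing entropy by assembling the blocks.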