
There’s a problem with the Big Bang theory.
No, I’m not suggesting the Big Bang theory is wrong. There are a few scientists who dispute it*, but this post isn’t about that. And it isn’t about the TV show either.** This is about the growing mystery in the field of cosmology about the expansion rate of the universe—and, by extension, the age of the universe. I mentioned two weeks ago that cosmologists have figured out a new way to measure this expansion, but does this method solve the mystery, or only deepen it?
There are two main methods to measure the expansion rate of the universe. (There are others, but these two are the most important, and the rest mostly agree with one or the other.) One is to observe a particular type of supernova called a Type Ia supernova. Type Ia’s have a distinctive spectrum, and they’re all about the same brightness***, so if you measure their apparent brightness and their redshift, you get a pretty good measure of how fast the universe was expanding at the time the star exploded. However, this method lets you trace the expansion only about halfway back to the beginning of the universe; beyond that, the supernovae are too faint to measure accurately.
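For the curious, here’s a minimal sketch of that standard-candle logic in Python. The numbers are illustrative ballpark values, not real survey data: a peak absolute magnitude around −19.3 is a typical figure for Type Ia’s, and the simple velocity-equals-cz relation only holds for nearby objects.

```python
# Standard-candle sketch: estimate the Hubble constant from a single
# nearby Type Ia supernova. All numbers here are illustrative.
M_ABS = -19.3          # assumed peak absolute magnitude of a Type Ia
C_KM_S = 299_792.458   # speed of light in km/s

def hubble_from_supernova(apparent_mag, redshift):
    """Return H0 in km/s/Mpc from an apparent magnitude and redshift.

    Distance modulus: m - M = 5 * log10(d_parsecs) - 5.
    Then the low-redshift Hubble law: c * z = H0 * d.
    """
    distance_pc = 10 ** ((apparent_mag - M_ABS + 5) / 5)
    distance_mpc = distance_pc / 1e6
    recession_velocity = C_KM_S * redshift   # only valid for z << 1
    return recession_velocity / distance_mpc

# A made-up supernova at z = 0.01 peaking at apparent magnitude 13.9:
print(hubble_from_supernova(13.9, 0.01))   # roughly 69 km/s/Mpc
```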
The second method is to look at the cosmic microwave background (CMB). The tiny variations in the afterglow of the Big Bang, measured by a microwave-frequency telescope, provide information about how fast the universe was expanding in the first few hundred thousand years of its life. These variations truly are tiny—one part in ten thousand, in an afterglow that is only three degrees above absolute zero—but they still allow the expansion of the universe to be measured precisely.
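Just to put “tiny” in plain numbers:

```python
# "One part in ten thousand" of a 2.7-kelvin glow, in plain numbers:
cmb_temperature_k = 2.725            # measured average CMB temperature
fluctuation_k = cmb_temperature_k / 10_000
print(fluctuation_k)                 # ~0.0003 K, about 0.3 millikelvin
```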
The rate of expansion is expressed in a number called the Hubble constant, which tells how fast galaxies are moving away from us as a function of distance. And here’s the problem. Measurements of the CMB say the Hubble constant is 67.4 ± 0.5 kilometers per second per megaparsec. (Don’t worry about the units; they aren’t important for this discussion.) This would mean that the universe is 13.8 billion years old, which is the number you usually see in print. But measurements of supernovae and nearby stars like Cepheids say that the Hubble constant is 73.3 ± 0.8, and that the universe is only 12.7 billion years old. There’s some variation, of course (one method using red giant stars lands smack in the middle), but the point is that the numbers don’t match.
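If you’re wondering how a Hubble constant turns into an age, here’s a rough sketch in Python. It assumes a flat universe with roughly 31.5% matter and 68.5% dark energy (standard ballpark values; radiation is neglected), and it reproduces both ages quoted above:

```python
import math
from scipy.integrate import quad

# Rough age of a flat universe, assuming ~31.5% matter and ~68.5% dark
# energy (standard ballpark values; radiation is neglected).
OMEGA_M, OMEGA_L = 0.315, 0.685
KM_PER_MPC = 3.086e19     # kilometers in one megaparsec
SEC_PER_GYR = 3.156e16    # seconds in one billion years

def age_gyr(h0_km_s_mpc):
    """Age of the universe in billions of years, given H0 in km/s/Mpc."""
    # t0 = integral from z = 0 to infinity of dz / [(1+z) * H(z)]
    integrand = lambda z: 1.0 / ((1 + z) * math.sqrt(
        OMEGA_M * (1 + z) ** 3 + OMEGA_L))
    dimensionless_age, _ = quad(integrand, 0, math.inf)
    hubble_time_sec = KM_PER_MPC / h0_km_s_mpc   # 1/H0, in seconds
    return dimensionless_age * hubble_time_sec / SEC_PER_GYR

print(age_gyr(67.4))   # ~13.8 billion years (the CMB number)
print(age_gyr(73.3))   # ~12.7 billion years (the supernova number)
```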
Now, we come to the new method. A team of cosmologists at Clemson University figured out a new way of measuring the expansion of the universe using gamma rays. The theory is that when very, very high-energy gamma rays produced by things like black holes in the early universe interact with starlight (not gas and dust, but other photons), they can turn into electron-positron pairs. This makes some of the gamma rays disappear, which means their sources appear dimmer than they should be. If you compare the observed brightness of these sources with theoretical models, you get an absolute measure of the time the gamma rays spent in transit. Match up that travel time with the redshift, and you’re golden. Surprisingly, these models are accurate enough to give a pretty good measure of the expansion of the universe. So how do they measure up? They match the early-universe CMB data: 67.5.
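To make that logic concrete, here’s a toy version. The absorption coefficient below is invented purely for illustration; the real models depend on the gamma-ray energy and on careful accounting of the background starlight.

```python
import math

# Toy version of the gamma-ray attenuation idea. Pair production against
# background starlight dims a source by a factor of exp(-tau), where the
# optical depth tau grows with how far the gamma rays have traveled.
TAU_PER_GPC = 0.8   # hypothetical optical depth per gigaparsec of travel

def implied_travel_gpc(observed_flux, modeled_intrinsic_flux):
    """Distance implied by how much dimmer a source looks than the
    theoretical model says it should be."""
    tau = math.log(modeled_intrinsic_flux / observed_flux)
    return tau / TAU_PER_GPC

# A source modeled to emit a flux of 1.0 but observed at 0.3 implies
# about 1.5 Gpc of travel through the background starlight.
print(implied_travel_gpc(0.3, 1.0))
```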
This is especially interesting because in a way, this method measures the age of the universe directly instead of the expansion rate. Most methods of determining distances in the universe (and therefore its expansion) rely on “standard candles” like the Type Ia supernovae. But these only tell us the relative distances to similar types of objects. There are other methods—multiple image lensing, baryon acoustic oscillations, and megamaser orbits (yes, those are real words)—that do measure absolute distances because we can calculate them with trigonometry instead of just brightness, but there’s still a lot of variation in them that makes it hard to measure the Hubble constant precisely.
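The trigonometry in question is mostly the small-angle rule: true physical size divided by angular size gives distance. Here’s a tiny sketch, with hypothetical megamaser numbers chosen only for illustration:

```python
# The trigonometric core of an "absolute" distance measure: an object
# one astronomical unit across, seen to span one arcsecond, is by
# definition one parsec away. The maser numbers below are hypothetical.
def distance_parsecs(physical_size_au, angular_size_arcsec):
    return physical_size_au / angular_size_arcsec

# A maser orbit 40,000 AU across that spans half a milliarcsecond on
# the sky would sit 80 million parsecs away, no standard candle needed.
print(distance_parsecs(40_000, 0.0005))   # 8e7 parsecs
```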
However, this gamma ray method is the first one I’ve seen that makes an absolute measure of time. Granted, it’s still described as a distance measure, but for light, it’s pretty much the same thing. Either way, this is a new absolute (not relative) distance measure, which (like baryon acoustic oscillations) agrees with the early universe data.
So, what’s going on? Which number is right? Well, on one hand, there is a loophole here that lets us say, “maybe both.” Note that earlier, I said we were measuring the expansion rate of the universe at a particular time (and interpreting it as a single present-day number). In the simplest models of the universe, the number you infer shouldn’t depend on which era you look at, but there are many ways it could. Cosmologists have plenty of ideas about dark energy that could make that happen.
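Here’s one toy version of such a loophole. In a flat universe, the expansion rate at any past era depends on how the dark energy term evolves, conventionally parameterized by a number w (w = −1 is a plain cosmological constant). Tweak w and the expansion history changes shape, so measurements anchored at different eras can disagree about the present-day number. The functional form is the standard flat-universe relation; the parameter values are illustrative.

```python
import math

# Flat-universe expansion rate as a function of redshift, with the dark
# energy term allowed to evolve. w = -1 is a plain cosmological
# constant; other values are a toy stand-in for the kind of new physics
# that could reconcile the two measurements.
def hubble_at_z(z, h0=67.4, omega_m=0.315, w=-1.0):
    """H(z) in km/s/Mpc for a flat universe with constant-w dark energy."""
    omega_de = 1.0 - omega_m
    return h0 * math.sqrt(omega_m * (1 + z) ** 3
                          + omega_de * (1 + z) ** (3 * (1 + w)))

# Same anchor today, different histories looking back:
print(hubble_at_z(1.0))           # ~121 km/s/Mpc at z = 1 with w = -1
print(hubble_at_z(1.0, w=-0.9))   # a slightly different history at z = 1
```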
On the other hand, if the higher number for the nearby universe is true, meaning a younger universe, we have a problem: some of the stars are too old for it. We have various ways of measuring the ages of stars, and they tell us that the oldest stars are about 13.2 billion years old. That works nicely if the universe is 13.8 billion years old, but not so much if it’s only 12.7 billion. That’s why I made a big deal about measuring absolute time. The gamma ray measurement tells us how much space-time the gamma rays actually had to pass through to reach us, and it agrees with the older age of the universe. So, even if the Hubble constant does change over time, we can be a little more confident that we’ve nailed down the age of the universe correctly.
*And there are Creationists who cite them, even though the main alternative to the Big Bang theory is not creationism, but a steady-state model where the universe has always existed and was uncreated. But that’s another story.
**Pro tip: “The Big Bang Theory” with capital T’s refers to the TV show; the “Big Bang theory” with lowercase T’s refers to the scientific theory.
***Actually, this is an oversimplification. Type Ia supernovae vary quite a bit in brightness, but their peak brightness is tightly correlated with how long they take to fade away, so the fade time tells you the true brightness.