Do the Demographics of Logan’s Run Make Sense?


As a companion to this week’s episode of A Reader’s History of Science Fiction, I wanted to take a closer look at the science behind one of the books I’ll be talking about: Logan’s Run by William F. Nolan and George Clayton Johnson.

In Logan’s Run, the world combats overpopulation by euthanizing everyone over the age of 21—a society completely by and for the youth. You may be thinking that number is wrong, but if you are, that’s probably because you’re thinking of the movie. In the movie, which is quite a bit better known, everyone is killed at 30 years old.

I want to take a look at the book, though, because a society where everyone is under 21 seems extreme and unworkable, even though they define adulthood to start at 14. But the really strange part is that Nolan and Johnson write that the youth massively dominated the world’s population before the revolution. As they write in the opening lines to the book:

The seeds of the Little War were planted in a restless summer during the mid-1960s, with sit-ins and student demonstrations as youth tested its strength. By the early 1970s, over 75 percent of the people living on Earth were under twenty-one years of age. The population continued to climb—and, with it, the youth percentage. In the 1980s, the figure was 79.7 percent. In the 1990s, 82.4 percent. In the year 2000—critical mass.

Logan’s Run was published in 1967, when the fears of overpopulation were at their peak, and at the same time (at least in America), youth activism was becoming a major political force. Nolan and Johnson extrapolate this to suggest that the population boom of the 50s and 60s would lead to a massive rise in the youth population that would give them the power to take over the world.

…But 82.4%? Really?


You have to wonder if this was artistic license. The problems are obvious when you think about it. After all, if 75% of the world population was under 21 in the 70s, they would all be over 21 by the 90s. Either a lot of them died off in their 20s and 30s, or the world population sextupled to 20 or 25 billion by 2000 so that a new generation could make up 82.4% of it.
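To put rough numbers on that, here’s the back-of-envelope version, taking the 1970 world population as about 3.7 billion (my figure, not the book’s):

```python
# If the 75%-under-21 cohort of 1970 mostly survives to 2000, it can make up
# at most 17.6% of the population, since 82.4% has to be under 21.
pop_1970 = 3.7e9                        # approximate world population in 1970
under_21_in_1970 = 0.75 * pop_1970      # the book's claimed youth share
print(f"{under_21_in_1970 / (1 - 0.824) / 1e9:.0f} billion")   # about 16 billion, minimum
```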

Still, I had to wonder if it would be possible for a society to get so out of balance in terms of age structure. It is true that the population boom happened because a lot more children were…actually not being born, but surviving. The largest cause of the population “bomb” everyone feared in the 60s was actually a dramatic decline in infant mortality.

For the vast majority of human history, a baby’s odds of living to his or her first birthday were down around 50/50 (although it varied a lot). Today, those odds are up around 97%, and that’s as a global average. That means that for most of human history, people were used to having twice as many babies as they needed to maintain their population because half of them would die. Eliminate infant mortality, and the population doubles in a generation. Eventually, people adjust by having fewer children, but that takes time.
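The arithmetic is simple enough to spell out, taking (for illustration) four births per woman as the historical norm:

```python
births_per_woman = 4                   # twice as many babies as needed for replacement
survival_to_adulthood = 0.5            # roughly half die young
print(births_per_woman * survival_to_adulthood)   # 2 surviving children per woman: steady population

survival_to_adulthood = 1.0            # eliminate infant mortality
print(births_per_woman * survival_to_adulthood)   # 4 surviving children per woman: doubling per generation
```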

Even with double the number of children, does 82.4% make sense? Well, let’s look at the real world demographics of the time. In 1965, the country with the highest birthrate in the world was Rwanda, at 8.2 children per woman. You can look up its age structure at the time, and it turns out that its population under 21 was…60%.

I looked at some other countries with very high birth rates, and Samoa was at 63%, but I couldn’t find anything higher. When you look at the real-world numbers, even the initial 75% figure from Logan’s Run becomes ludicrous.

But let’s go beyond the real world. In Rwanda, the infant mortality rate in 1965 was nearly 14%. What if we set it to zero?


At this point, we don’t have any real data, so we’ll have to build a mathematical model, computing the number of births and deaths over time according to some formula. This is a pretty simple task that can be done in a few lines of Python. So, what numbers should we use?

For the birthrate, we’ll use the same eight children per woman, but when are they born? Let’s suppose all women have a baby every two years from age 18 to 30. That’s unusual, historically, but plausible.

How about the death rate? To maximize the youth population, we’ll say that no one under the age of 21 dies. Then, to make the decline in the adult population as steep as possible, we’ll say that a fixed percentage of adults die every year regardless of age. This makes the survival rate an exponential decay function. (And to keep things simple, we’ll artificially cut off the population at age 80.) This is typical of many small animals, but more importantly, it’s also plausible historically for humans living in particularly dangerous or unhealthy environments. The reciprocal of this death rate is the life expectancy at age 21. Plug these numbers into Python, and you’re off to the races.
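Here’s a minimal sketch of that model in Python, with one detail filled in by assumption: half of all births are girls.

```python
import numpy as np

def youth_fraction(life_exp_21, birth_ages, youth_cutoff=21, max_age=80, years=300):
    """Fraction of the (stabilized) population under youth_cutoff.

    Model: no one dies before the cutoff; after it, a fixed fraction of adults
    dies each year (the reciprocal of life expectancy at 21), and everyone is
    cut off at max_age. Each woman has one child at each age in birth_ages.
    """
    death_rate = 1.0 / life_exp_21
    birth_ages = list(birth_ages)
    pop = np.ones(max_age + 1)                    # head count by age; the starting shape washes out
    for _ in range(years):                        # run until the age structure stabilizes
        babies = pop[birth_ages].sum() / 2.0      # half the people at each birth age are women
        pop[youth_cutoff:] *= 1.0 - death_rate    # adults die at a fixed annual rate
        pop = np.concatenate(([babies], pop[:-1]))  # everyone ages a year; the oldest drop off
    return pop[:youth_cutoff].sum() / pop.sum()
```

A few hundred simulated years is more than enough for the age structure to settle into its stable shape, so the arbitrary starting population doesn’t matter.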

Can you get a youth fraction of 82.4% with these numbers? Yes…if you set the life expectancy at age 21 to about 10 years—in other words, life expectancy at birth is 31.
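With the sketch above and one more assumption (a baby every two years from 18 through 30 only gives seven children, so I’ve extended the schedule to 32 to reach eight), that looks like:

```python
ages = range(18, 33, 2)                    # births at 18, 20, ..., 32: eight children per woman
print(f"{youth_fraction(10, ages):.1%}")   # comes out right around 82% with these assumptions
```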

Well, that’s pretty short, but it’s not historically unheard of, you may say. Wasn’t the life expectancy in Ancient Rome about 25?

No! Well, technically yes, but no.

One of the biggest misconceptions about life expectancy is that it’s a measure of how long most people live. Today, that’s close to true, but in the past it was very much not. When we say “life expectancy,” we usually mean life expectancy at birth, and because it’s an average, it can be skewed by infant mortality.

When we say that the life expectancy in Ancient Rome was 25 years, it certainly doesn’t mean that people were physically old at age 25. It doesn’t even mean that people were in conspicuously poor health at age 25, or that they were likely to die of disease at age 25. Living in close quarters in unsanitary conditions (by modern standards) and surrounded by lead pipes made those problems worse, but not that much worse.

Instead, a life expectancy of 25 means that half of all babies died before their first birthday, and the other half mostly grew up and lived to be 50 or 60. To get a measure of how long most adults live, you need to look at a different statistic, usually life expectancy at age 15.
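The arithmetic behind that is worth spelling out, using round numbers of my own (half of babies dying in infancy, the survivors averaging about 50):

```python
# Life expectancy at birth is just the average age at death over everyone ever born
infant_share, infant_age_at_death = 0.5, 0.5   # half die before their first birthday
adult_share, adult_age_at_death = 0.5, 50.0    # the other half mostly reach 50 or 60
print(infant_share * infant_age_at_death + adult_share * adult_age_at_death)   # about 25 years
```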

In Rwanda in 1965, life expectancy at age 15 was 45 years (that is, to age 60). Thus, life expectancy at age 21 was 39 years. Plug that number into our model, and you get a youth fraction of 74%—not even up to Logan’s Run’s initial 75 percent.
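Running the earlier sketch with that life expectancy (and the same birth schedule):

```python
print(f"{youth_fraction(39, range(18, 33, 2)):.1%}")   # about 74% with these assumptions
```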

Now, the results actually depend quite a lot on the ages at which women have children. Obviously, with the death rates we’ve set, the number of potential mothers declines after age 21. What if girls start having children at age 16 and have a baby every year (still 8 in total)? If you put in those ages, the results are almost right on the money: a youth fraction of 82.7%.
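In the sketch, that change looks like this; the exact figure shifts a point or so depending on how the details are set up:

```python
print(f"{youth_fraction(39, range(16, 24)):.1%}")   # a baby every year from 16 through 23: low 80s
```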

But that is not a normal historical birth pattern, for multiple reasons that you can probably guess. The bottom line is that it’s possible to reach a youth fraction of 82.4%, but it requires extremely high birth rates, very low infant mortality, and either an unnaturally young cohort of mothers or an unnaturally short life expectancy, a combination of demographic forces that simply does not occur together under normal circumstances.

And that’s not even getting into the fact that Rwanda in 1965 was not the same as the United States in 1965. In fact, the gap between the two was quite a bit wider then than it is today; demographically, Rwanda in 1965 looked more like the United States in 1800. There’s no way you could get those numbers worldwide except maybe after the Black Death or something, so I’m going to fall back on it being strictly artistic license.
