It’s Not in Your Head: The World Really Is Getting Worse

Since the 1970s, wages, infrastructure, and the pace of technology have all stagnated. Can it be reversed?


On May 30, 2020, a rocket lifted off from the famous Launch Complex 39A at NASA’s Kennedy Space Center, in Florida. Was it a manned mission to Mars? The testing of a completely new form of propulsion? The first step toward what would become a permanent base on the moon?

Not quite. NASA administrator Jim Bridenstine marked the moment with some solemn words: “Today, a new era in human spaceflight begins as we once again launch American astronauts on American rockets from American soil on their way to the International Space Station.”

American rockets have been putting American astronauts into space since the 1960s. The first module of the ISS was launched in 1998, and the station has been continuously inhabited for over twenty years. What made this event groundbreaking was that the rocket was built by SpaceX, making it the first time a private company had launched American astronauts into orbit. And, while NASA didn’t dwell on it, it was also the first time Americans had put Americans into space since the Space Shuttle program was shut down, in 2011. The US had been relying on rides from the Russians ever since.

In 1961, when President John F. Kennedy was looking for something to establish America’s undeniable superiority over the Soviets in order to underscore the virtues of his country’s economy and ingenuity, his advisers suggested landing a man on the moon. It was a stretch goal for sure, a difficult and hugely expensive undertaking. But Kennedy got congressional approval for the funds to land a man on the moon before the decade was out. In many ways, Kennedy’s argument for going to the moon was just a rocket-fuelled version of George Mallory’s quip about why he wanted to climb Mount Everest: “Because it’s there.” The moon mission also had a more direct geopolitical dimension, which Kennedy made clear in his speech to Congress. Space was just one very public front in the ongoing battle between freedom and tyranny.

“We go to space,” Kennedy told Congress, “because whatever mankind must undertake, free men must share.” He went on to demand that Congress either commit fully to the moon or not bother committing at all. It would be better not to try than to start and then back away. And it wasn’t just Congress that had to commit. The US would certainly fail unless every person involved—every scientist, engineer, contractor, and public servant—gave “his personal pledge that this nation will move forward, with the full speed of freedom, in the exciting adventure of space.”

The full speed of freedom. In that simple and remarkable phrase, Kennedy brought together a number of tightly connected ideas that mortared the fundamental relationship between democracy, capitalism, and innovation. It wasn’t really about the moon—as Kennedy freely admitted, he didn’t actually care about space as such. It was just that doing hard things and solving hard problems are what free people do. They don’t submit to authority; they commit to collective action. In doing so, they unleash “the full speed of freedom.”

Sixty years after Kennedy’s speech to Congress, you’d be hard pressed to find anyone willing to make the case that accomplishing hard tasks, solving hard problems, and committing to collective action are particular ambitions or ideals or strengths of American democracy.

So the return to space with SpaceX is good news by one measure. But, by most other measures, it marks a rather surprising stagnation. Fifty years before the SpaceX success, Apollo 11 and Apollo 12 had already landed astronauts on the moon, and the crew of Apollo 13 had been brought safely back to Earth after an oxygen tank exploded, threatening their lives. But no human has been out of low Earth orbit since 1972, and NASA has gone nearly a decade without putting anyone into any orbit at all.

It’s tempting to think of this stagnation as a specific feature of the space race, but the pattern holds more generally. Compared to the amazing pace of invention and discovery that was the norm from the late 1700s until the first half of the twentieth century, the last fifty years have been a bit of a snooze.

To give a quick run-through: The period from 1770 to 1820 saw the invention of the cotton gin, the electric battery, the steam locomotive, and the Watt engine. If you entered your fifty-year cryogenic sleep in 1820, you’d have woken up to a planet transformed by cement, the telegraph, the typewriter, cameras, bicycles, antiseptics, pasteurization, and dynamite. If you fell back asleep for another half century, you’d wake up to an entirely different world, filled with telephones, movies, electricity, cars, airplanes, machine guns, air conditioners, vacuums, and radio. And, if you embarked on another fifty-year sleep in 1920, you’d miss the development of radar, jet planes, space flight, lunar landings, nuclear weapons, nuclear power, penicillin, electric guitars, VCRs, computers, video games, the internet, and just about any other mod con you can imagine.

For 200 years, every half century of sleep would have you waking up to a new age of miracle and wonder: new technologies; stunning advances in health, wealth, and comfort; amazing new products and consumer goods; and a world steadily shrinking thanks to new forms of transportation and communication. Politically, things would keep changing enormously as well, the age of monarchy, colonialism, and empire giving way to a world order focused on nation states and led by an ever-growing alliance of liberal democracies.

And then, in 2020, you’d wake up maybe a little disappointed. At first, it might seem that economic growth had once again worked its magic, most obviously in the way everyone was carrying the world’s entire cultural inheritance in their pockets. But that would just be the logic of networked computing (which already existed) playing itself out. Aside from that, the world would look, in many respects, like it had stalled or even gone backward.

It wouldn’t be just an illusion. Since the 1970s, real wages have seen little to no growth, especially for middle- and low-income earners, while public goods like education and health care are more expensive than ever. Our infrastructure is crumbling, traffic congestion gets worse and worse, our airports are decrepit, and the trains almost never run on time. Nuclear energy, once hailed as the energy of the future, has been a flop. In the home, there have been huge advances in consumer electronics, like enormous flat-screen televisions and AI-driven sound systems that will play any song you like: all you have to do is ask. But, in the kitchen and the laundry room, there has been a reversal. Dishwashers are slower today than they were forty years ago, hold fewer dishes, and don’t get them as clean. Ditto for clothes washers. We’ve been spinning our wheels for a few decades now.

Narratives that see widespread economic and technological stagnation setting in sometime around the mid-1970s have become fairly common. And it’s pretty clear that the 1970s were some sort of inflection point, a time when we fell off established income, innovation, and progress curves on a number of fronts. The big question is why.

The concept economists call “secular stagnation” dates to the late 1930s, when Harvard economist Alvin Hansen used it to describe the idea that a combination of an aging population, low rates of immigration, and the exhaustion of technological progress would lead to an imbalance between excess household savings and inadequate business investment. The result would be an extended period of little to no economic growth. The theory got new traction in the aftermath of the 2008 financial crisis thanks to another Harvard economist, Lawrence Summers. As he saw it, the main reason the economy struggled to recover from the Great Recession was secular stagnation: an increased propensity to save and a decreased propensity to invest, leading to “shortfalls in demand and stunted growth.” To get the economy rolling again, he prescribed a form of Keynesianism: governments just needed to spend a lot of money to spur demand.

But there’s a version of the stagnation thesis that’s a bit more complicated. In 2011, economist Tyler Cowen published a short but very influential book, The Great Stagnation, in which he traced the crisis not to a shortfall in demand but to basic structural problems with our social and economic systems.

According to Cowen, the story began back in the 1970s, when median wages effectively stopped rising, and carried on through the first decade of the new century, which saw virtually no net job creation. Through all this, we maintained the illusion of growing prosperity thanks to increasing household debt and inflated home prices. But, he writes, “all of these problems have a single, little-noticed root cause . . . . We have been living off low-hanging fruit for at least 300 years. We have built social and economic institutions on the expectation of a lot of low-hanging fruit, but that fruit is mostly gone.”

What did this bounty consist of? Cowen suggests three possibilities: the benefits of “free land” (which he concedes was largely stolen from the original occupants), the technological breakthroughs of the Industrial Revolution, and the enormous pool of smart but uneducated people who gradually moved off the farm and into cities, where they got educated. Other possible factors include access to cheap and abundant energy sources, especially oil; the expansion of democracy and liberal values; and the slow but steady emancipation of women and their incorporation into the workforce. We have exploited each of these in turn, and each has given us enormous economic gains at very low cost.

Most notably, the Industrial Revolution that transformed Europe in the late eighteenth century led to an explosion of technological breakthroughs a century later. While the telegraph, railroads, and steam shipping were already around, the period between 1880 and 1940 brought us indoor plumbing, electric lighting, cars, airplanes, the telephone, radio, pharmaceuticals, and plenty of other innovations. Each of these things is near miraculous on its own, but the real benefits came from combining them: complicated machines with the cheap and abundant energy produced by fossil fuels or electricity, say. Add in fast and increasingly cheap communications, from the telegraph to the telephone to the radio, and ideas could spread more quickly than ever before. The result was the rapid expansion of civilization and progress.

All of this amounts to a pretty stiff counter to the naive Enlightenment position that progress occurred when we emerged from our intellectual infancy. According to the stagnation thesis, what we’d thought of as “progress” wasn’t really a ladder that we climbed to a more permanent plateau of development. It was more like an oasis that we stumbled upon after wandering around a desert for millennia.

Still, Cowen remains fairly optimistic. He thinks the situation will probably sort itself out in a couple of decades or so, once we figure out how to realize significant productivity gains from sources like biotechnology and the internet. There’s a more pessimistic version of the argument, though, one that says the root of the problem isn’t the exhaustion of natural resources or any hard technological barrier. Instead, the causes of the Great Stagnation are fundamentally political and cultural—and the forces holding back progress are getting more entrenched, not less.

In a self-published 2017 book titled Where Is My Flying Car?, computer scientist and nanotech futurist J. Storrs Hall argues that the Great Stagnation is real but that its cause isn’t what most people think. The real enemy of progress isn’t technological limits but political interference and cultural hostility to science.

The spine of his argument is the story of the rise and fall of the flying car. As Hall argues, there was a healthy research and development industry around the flying car as early as the 1920s. In the natural evolution of things, we should have had affordable, consumer-focused flying cars by the 1970s or ’80s. What stopped it was a combination of the bureaucratization of scientific research, the rise of the regulatory state, and the baleful influence of the Luddite wing of the counterculture.

Hall’s book has become something of a cult classic among engineers, economists, and Silicon Valley types. But the story Hall tells about the flying car is just a microcosm of his explanation for the more general technological stagnation of the past fifty years. As Hall points out, the Great Stagnation arrived in lockstep with the explosion in PhDs and the large-scale takeover of research and development funding by the state. The result is what he calls “the Machiavelli effect,” in which centralized funding creates an intellectual elite of political insiders who gain control of a field. This elite has a vested interest in the status quo, which it preserves by controlling access to funding and by manipulating the regulatory process to which it has privileged access. For example, he notes that, when the Clinton administration launched a nanotechnology initiative that redirected existing funding streams, the affected researchers simply redescribed whatever they were already doing as “nanotech.” It’s a pattern anyone familiar with the Canadian government’s recent investment in AI research will recognize.

This sort of turf protection isn’t restricted to science and technology. It’s a standard feature of almost any government-funded bureaucracy, including in arts and culture. For anyone who works in Canada’s culture industry, the Machiavelli effect is a perfect description of what it’s like to navigate the funding gatekeepers. The point, which Hall is careful to emphasize, is that there’s no conspiracy at work here, just the entirely predictable protection of the interests of people who’ve done very well under an existing system.

To grasp the force of Hall’s central claim, you don’t need a graph showing how the Great Stagnation happened to coincide with a sharp rise in the number of PhDs awarded each year. All you have to do is live in North America in the twenty-first century, where the basic elements of Hall’s story—progress and innovation stymied by political forces—are staring us in the face. Canada in particular has become notorious as a place where it’s almost impossible for governments, or even the private sector, to get anything done. Francis Fukuyama, who has in recent years turned his attention to institutional decay and political decline, coined the term “vetocracy” to describe the system of entrenched political interests that has made it very hard to get anything built or done in America.

So forget flying cars. Why is there no nanotech industry to speak of? Why was the nuclear power industry effectively smothered in the 1970s? Why did the market for general aviation aircraft fall off a cliff in the early 1980s? Why has “cost disease” afflicted so many industries, from education to health care to construction? Why does a kilometre of subway in New York or London or Toronto cost more than double (in constant dollars) what it cost in the sixties and seventies?

The answers to all of these questions have less to do with inherent technological limits and more to do with politics, bureaucracy, and regulation. As Hall puts it, “the trees of knowledge are growing higher than ever, but someone appears to have been spraying paraquat on the low-hanging fruit.” The Great Stagnation is more like a great strangulation. To vary the metaphor, it’s like we stumbled out of the desert into a great buffet, ate all the food, and then spent the next forty years tying the chefs up with ever-more-stringent rules over what they could cook, under what conditions, and whom they could serve it to.

Adapted from On Decline. Copyright © Andrew Potter, 2021. Published by Biblioasis. Reprinted by permission of the publisher.

Andrew Potter is an associate professor (professional) at the Max Bell School of Public Policy.