What if the future looks exactly like the past?

Aug 26, 2025 - 10:22

When Peter Drucker first met IBM CEO Thomas J. Watson in the 1930s, the legendary management thinker and journalist was somewhat baffled. “He began talking about something called data processing,” Drucker recalled, “and it made absolutely no sense to me. I took it back and told my editor, and he said that Watson was a nut, and threw the interview away.”

Things that change the world always arrive out of context for the simple reason that the world hasn’t changed yet. So we always struggle to see how things will look in the future. Visionaries compete for our attention, arguing for their theory of how things will fit together and impact our lives. Billions of dollars are bet on competing claims. 

This is especially true today, with artificial intelligence making head-spinning advances. But we also need to ask: What if the future looked exactly like the past? Certainly, there’s been no lack of innovation since Drucker met Watson. How did those technologies impact the economy and shape our lives? If we want to know what to expect from the future, that’s where we should start. 

The First Productivity Paradox

Thomas J. Watson [Photo: IBM]

Watson would, of course, build IBM into an industrial giant. But it was his son Thomas Watson Jr. who would transform the industry in 1964 with the $5 billion gamble (nearly $50 billion in today’s dollars) on the System/360, a platform that would dominate the computing world for two decades. It was, essentially, the Apple iPhone and Microsoft Windows of its time, combined. 

Just as the elder Watson had foreseen, data processing became central to how industry functioned. In the 1970s and ’80s, business investment in computer technology was increasing by more than 20% per year. Yet, strangely, productivity growth was falling. Economists coined the term “productivity paradox” to describe this strange contradiction. 

The productivity paradox dumbfounded economists because it violated a basic principle of how a free market economy is supposed to work: if profit-seeking businesses keep making substantial investments, you'd expect to see a return. Yet firms kept ramping up IT spending through the '70s and '80s with negligible measurable benefit.

A paper by researchers at the University of Sheffield in England sheds some light on what happened. First, productivity measures were largely developed for an industrial economy, not an information economy. Second, the value of those investments, while substantial, was a small portion of total capital investment. Third, businesses weren’t necessarily investing to improve productivity, but to survive in a more demanding marketplace.

By the late 1990s, however, that began to change. Increased computing power, combined with the rise of the internet, triggered a new productivity boom. Many economists hailed a “new economy” of increasing returns, in which the old rules no longer applied. The mystery of the productivity paradox, it seemed, had been solved. We just needed to wait for the technology to hit critical mass and deliver us to the promised land. 

The Second Productivity Paradox

By the turn of the century, the digital economy was going full steam. While old industrial companies like Exxon Mobil, General Motors, and Walmart still topped the Fortune 500, new economy upstarts like Google, Apple, and Amazon were growing quickly and, after a brief dotcom bust, would challenge the incumbents for dominance. 

By 2004, things were humming again. Social media was ramping up, and Tim O’Reilly proclaimed the new era of Web 2.0. A few years later, Apple launched the iPhone and that, combined with the new 4G standard, ushered in the era of mobile internet. New cloud computing services such as Amazon Web Services and Microsoft Azure would make vast computing power available to anyone with a credit card. 

Yet as economist Robert Gordon has pointed out, by 2006 it had become clear that productivity was slumping again and, despite some blips here and there, it hasn’t recovered since. For all of the hype coming out of Silicon Valley, we’ve spent the past 20 years in the midst of a second productivity paradox. 

Clearly, things have qualitatively changed over the past two decades. We are no longer tethered to our desks at work. A teenager with a smartphone in a developing country has more access to information today than a professional working at a major institution did back then. It is, to paraphrase Robert Solow’s famous quip, as if we can see the digital age everywhere but in the productivity statistics. 

Searching For Utopia . . . And Finding So-So Technologies

Business pundits claim that things have never moved faster, but the evidence shows exactly the opposite. In fact, we’ve been in a productivity slump for over half a century. Data also shows that industries have become more concentrated, not more competitive, over the past 25 years. U.S. corporate profits have roughly tripled as a percentage of GDP in that same time period.

So what gives? The techno-optimists keep promising us some sort of utopia, with a hypercompetitive marketplace yielding productivity gains so vast that our lives will be utterly transformed for the better. But the data says otherwise. How do we reconcile the visions of the Silicon Valley crowd with the hard analysis of the economists?

Some of the same factors behind the first productivity paradox are still at play. According to Statista, the digital economy makes up only about 9% of GDP. An analysis by the Federal Reserve Bank found that while AI is having a huge impact on some tasks, such as computing and math, it’s not having much of an effect at all on things like personal services, office and administration work, and blue-collar labor. 

Part of the answer may also lie in what economists Daron Acemoglu and Pascual Restrepo call so-so technologies, such as self-checkouts in supermarkets, screen ordering at airport bars, and automated customer service systems. These produce meager productivity gains and often put a greater burden on the consumer. 

The simple truth is that our economy is vast, and digital technology plays only a limited role in most of it. Next time you’re checking your smartphone in traffic, ask yourself: Is your chatbot making your rent any cheaper? Is it getting you through traffic any faster? Or making your trip to the doctor any less expensive?

Innovation Should Serve People, Not The Other Way Around

In his 1954 essay, “The Question Concerning Technology,” German philosopher Martin Heidegger described technology as akin to art, in that it reveals truths about the nature of the world, brings them forth, and puts them to some specific use. In the process, human nature and its capacity for good and evil are also revealed.

He offers the example of a hydroelectric dam, which uncovers a river’s energy and channels it into electricity. In much the same sense, the breakthrough technologies of today—like the large language models that power our AI chatbots, the phenomena of entanglement and superposition that drive quantum computing, as well as technologies like CRISPR and mRNA that fuel tomorrow’s miracle cures—were not “built” so much as they were revealed.

In another essay, “Building Dwelling Thinking,” Heidegger explains that what we build for the world depends on how we interpret what it means to live in it. The relationship is, of course, reflexive. What we build depends on how we wish to dwell, and that act, in and of itself, shapes how we build further.

As we go through yet another hype cycle, we need to keep in mind that we’re not just building for the future, but also for the present, which will look very much like the past. While it is, of course, possible that we are on the brink of some utopian age in which we unlock so much prosperity that drudgery, poverty, and pain become distant memories, the most likely scenario is that most people will continue to struggle.

The truth is that innovation should serve people, not the other way around. To truly build for the world, you need to understand something about how people live in it. Breakthrough innovation happens when people who understand technical solutions are able to collaborate with people who understand real-world problems. Just like in the past, that’s what we need more of now. 
