Is Artificial Intelligence The New Productivity Paradox?
In the 1970s and 80s, business investment in computer technology was increasing by more than 20% per year. Strangely, though, productivity growth declined over the same period. Economists found this turn of events so puzzling that they dubbed it the productivity paradox.
Productivity growth would take off in the late 1990s, only to mysteriously drop again during the mid-aughts. At each juncture, experts would debate whether digital technology produced real value or was merely a mirage. The debate would continue even as industry after industry was disrupted.
Today, that debate is over, but a new one is likely to begin over artificial intelligence. Much like in the early 1970s, we have increasing investment in a new technology, diminished productivity growth and "experts" predicting massive worker displacement. Yet now we have history and experience to guide us and can avoid making the same mistakes.
You Can’t Manage (Or Evaluate) What You Can’t Measure
The productivity paradox dumbfounded economists because it violated a basic principle of how a free market economy is supposed to work. When profit-seeking businesses continue to make substantial investments, you expect to see a return. Yet throughout the 70s and 80s, firms kept increasing their IT investment with negligible measurable benefit.
A paper by researchers at the University of Sheffield sheds some light on what happened. First, productivity measures were largely developed for an industrial economy, not an information economy. Second, the value of those investments, while substantial, was a small portion of total capital investment. Third, the aggregate productivity numbers didn't reflect differences in management performance.
Consider a widget company in the 1970s that invested in IT to improve service so that it could ship out products in less time. That would improve its competitive position and increase customer satisfaction, but it wouldn't produce any more widgets. So, from an economic point of view, it wouldn't be a productive investment. Rival firms might then invest in similar systems to stay competitive but, again, widget production would stay flat.
So firms weren't investing in IT to increase productivity, but to stay competitive. Perhaps even more importantly, investment in digital technology in the 70s and 80s was focused on supporting existing business models. It wasn't until the late 90s that we began to see significant new business models being created.
The Greatest Value Comes From New Business Models—Not Cost Savings
Things began to change when firms saw the possibility of shifting their approach. As Josh Sutton, CEO of Agorai, an AI marketplace, explained to me, "The businesses that won in the digital age weren't necessarily the ones who implemented systems the best, but those who took a 'digital first' mindset to imagine completely new business models."
He gives the example of the entertainment industry. Sure, digital technology revolutionized distribution, but merely putting your programming online is of limited value. The ones who are winning are reimagining storytelling and optimizing the experience for binge-watching. That's the real paradigm shift.
"One of the things that digital technology did was to focus companies on their customers," Sutton continues. "When switching costs are greatly reduced, you have to make sure your customers are being really well served. Because so much friction was taken out of the system, value shifted to who could create the best experience."
So while many companies today are attempting to leverage AI to provide similar service more cheaply, the really smart players are exploring how AI can empower employees to provide a much better service or even to imagine something that never existed before. "AI will make it possible to put powerful intelligence tools in the hands of consumers, so that businesses can become collaborators and trusted advisors, rather than mere service providers," Sutton says.
It Takes An Ecosystem To Drive Impact
Another aspect of digital technology in the 1970s and 80s was that it was largely made up of standalone systems. You could buy, say, a mainframe from IBM to automate back-office systems or, later, Macintoshes or PCs with some basic software to sit on employees' desks, but that did little more than automate basic clerical tasks.
However, value creation began to explode in the mid-90s when the industry shifted from systems to ecosystems. Open-source software, such as Apache and Linux, helped democratize development. Application developers began offering industry- and process-specific software, and a whole cadre of systems integrators arose to design integrated systems for their customers.
We can see a similar process unfolding today in AI, as the industry shifts from one-size-fits-all systems like IBM's Watson to a modular ecosystem of firms that provide data, hardware, software and applications. As the quality and specificity of the tools continue to increase, we can expect the impact of AI to increase as well.
In 1987, Robert Solow quipped, "You can see the computer age everywhere but in the productivity statistics," and we're at a similar point today. AI permeates our phones, the smart speakers in our homes and, increasingly, the systems we use at work. However, we've yet to see a measurable economic impact from the technology. Much like in the 70s and 80s, productivity growth remains depressed. But the technology is still in its infancy.
We’re Just Getting Started
One of the most salient, but least discussed, aspects of artificial intelligence is that it's not an inherently digital technology. Applications like voice recognition and machine vision are, in fact, inherently analog. The fact that we use digital technology to execute machine learning algorithms is often a bottleneck.
Yet we can expect that to change over the next decade as new computing architectures, such as quantum computers and neuromorphic chips, come to the fore. As these more powerful technologies replace silicon chips computing in ones and zeroes, value will shift from bits to atoms and artificial intelligence will be applied to the physical world.
"Digital technology revolutionized business processes, so it shouldn't be a surprise that cognitive technologies are starting from the same place, but that's not where they will end up. The real potential is driving processes that we can't manage well today, such as in synthetic biology, materials science and other things in the physical world," Agorai's Sutton told me.
In 1987, when Solow made his famous quip, there was no consumer Internet, no World Wide Web and no social media. Artificial intelligence was largely science fiction. We're at a similar point today, at the beginning of a new era. There's still so much we don't yet see, for the simple reason that so much has yet to happen.
Image: Pixabay
Greg Satell is a popular author, keynote speaker, and trusted adviser whose new book, Cascades: How to Create a Movement that Drives Transformational Change, will be published by McGraw-Hill in April 2019. His previous effort, Mapping Innovation, was selected as one of the best business books of 2017. You can learn more about Greg on his website, GregSatell.com, and follow him on Twitter @DigitalTonto.