The Next Great Transformation Will Be from Bits to Atoms
When engineers from Xerox PARC showed off their revolutionary new personal computer, the Alto, at the company’s global conference in 1977, senior executives weren’t particularly impressed. It just didn’t seem to be relevant to their jobs or their business. Their wives, however, were transfixed.
The reason for the disparity was that the executives saw a tool to automate secretarial work, which they considered a low-value activity. The wives, many of whom had been secretaries, saw an entirely new world of possibility, and when Steve Jobs built the Macintosh based on the Alto, everyone else saw it too.
It’s easy to shake our heads and laugh at those shortsighted executives of the past, but we’d do ourselves a much greater service by realizing that we are not that different. The truth is that the next big thing always starts out looking like nothing at all, so it’s hard to grasp its implications early on. That’s essentially where we are today with the shift from bits to atoms.
Anatomy Of A Revolution
These days we consider personal computers to be revolutionary, but as a stand-alone technology they were fairly limited. The original Macintosh was incredibly slow by today’s standards, with just 128 KB of memory and a 400 KB floppy drive for storage. It wasn’t easily connected to other computers, which made it useless for sharing information.
Over time, that would change. Complementary technologies made the information age possible: the relational database, which led to ERP software; Ethernet, which connected computers together; and, eventually, the Internet. As all of these technologies became vastly more powerful, the world was significantly transformed.
One of the most overlooked aspects of computing technology is how it made it possible to do simulations. Once computers were hooked up to massive databases, information could be downloaded and analyzed in spreadsheets. Executives could use that information to create different scenarios based on real-world data and apply those insights to make decisions.
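To make that concrete, here is a minimal sketch, in Python rather than a spreadsheet, of the kind of what-if scenario analysis described above; every figure and growth rate in it is an illustrative assumption, not real data.

```python
# A hypothetical what-if model: project next year's revenue under three
# growth scenarios, much as an executive's spreadsheet would have.
# All figures and rates below are made-up assumptions.

quarterly_revenue = [1.20, 1.35, 1.28, 1.41]  # trailing four quarters, in $M

scenarios = {"pessimistic": -0.05, "baseline": 0.03, "optimistic": 0.10}

trailing_year = sum(quarterly_revenue)
for name, growth in scenarios.items():
    projection = trailing_year * (1 + growth)
    print(f"{name:>11}: ${projection:.2f}M next year")
```

The point is not the arithmetic, which is trivial, but that the inputs could now come from live operational data rather than guesswork.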
None of this was obvious to anyone in 1977. In fact, these aspects of the technology wouldn’t become clear until the late 90s — a full two decades later. What the Xerox executives saw at the conference couldn’t have significantly helped them do their jobs, so it shouldn’t be surprising that they didn’t see what the big deal was.
The End Of Moore’s Law And The Rise Of New Computing Architectures
Computers have become so ubiquitous in the world today that it’s easy to miss something extraordinary going on. After decades of continuous improvement, our machines aren’t getting any better. Buy a laptop today and it’s likely to have nearly identical specifications to one you bought five years ago.
There are two reasons for this. First, the chip technology itself is nearing theoretical limits, so basic advancement is slowing down. Second, because computationally intensive tasks can be done more cheaply and conveniently in the cloud, we don’t have any great need for vastly more computing power on our desks or in our pockets.
Amid this slowdown of legacy technology, revolutionary new computing architectures are emerging. The first, quantum computing, can handle almost unimaginable complexity. The second, neuromorphic chips, can recognize patterns far more efficiently than conventional architectures while using much less power.
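To get a feel for the complexity quantum computers can handle: an n-qubit machine is described by 2^n complex amplitudes, a state space that quickly outgrows anything conventional hardware can even represent. A short sketch shows the scaling (the 16 bytes per amplitude is an assumed storage cost for a classical simulator):

```python
# State-space growth of a quantum computer: n qubits are described by
# 2**n complex amplitudes. Assuming a classical simulator stores each
# amplitude as a 16-byte complex number, the memory needed explodes.

for n in (10, 30, 50, 100):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9
    print(f"{n:>3} qubits -> {amplitudes:.2e} amplitudes, "
          f"~{gigabytes:.2e} GB to simulate classically")
```

At around 50 qubits, simulating the machine already demands petabytes of memory, which is roughly why that scale is often cited as the threshold where quantum machines outrun classical ones.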
One indication of what’s at stake is how many top firms are investing in these technologies. Google and IBM have very advanced quantum programs, while others, such as Microsoft and Intel, and startups like Rigetti and D-Wave, are also progressing fast. IBM, Intel, Qualcomm, and Nvidia all have advanced neuromorphic programs.
An Emerging Physical Stack
When most people think about digital technology, they usually think only about the top layer, the device and the user interface, but that is just a small fraction of the whole. There is an entire stack of technologies, from databases to middleware to applications, that goes into making it all work.
Today, a similar stack is being built for the physical world. New databases, such as The Cancer Genome Atlas and the Materials Genome Initiative, catalogue specific aspects of the physical world. These, in turn, are analyzed by powerful machine learning algorithms. The revolution underway is so profound that it’s reshaping the scientific method.
In the years to come, the new, more powerful computing architectures will drive the physical stack. Simulating chemistry is one of the first applications being explored for quantum computers, which will help us build larger and more detailed databases. Neuromorphic technology will allow us to analyze complex patterns and derive new insights.
The way we interface with the physical world is changing as well. Nanotechnology allows us to manipulate materials on a molecular scale, while new techniques such as CRISPR help us edit genes at will. Virtual reality will help us internalize insights, and advanced manufacturing techniques, such as 3D printing, will bring these visions into reality.
The Great Transformation
Innovation is never a single event, but a process of discovery, engineering and transformation, and we almost always underestimate the complexity and duration of the transformation stage. Douglas Engelbart presented the basic features of personal computers in 1968, but the economic impact didn’t hit until the late 1990s. Edison completed the first power station in 1882, but electricity didn’t begin transforming our lives until the 1920s.
There are two reasons transformation takes so long. The first is that complementary technologies need to emerge. We get little out of computers without applications, and electricity is of little use without machines designed to use it. Second, we need to redesign our organizations, work practices and lifestyles in order to get the most out of new technology.
On average, it takes about 30 years to go from initial discovery to significant market impact, and we are about a decade into the next great transformation. That puts us almost exactly where those Xerox executives were in 1977. They had no idea what personal computers would unleash and, if we’re honest, we need to admit that we are in the same boat.
What we can do is recognize that there is a great transformation underway, one that will unlock possibilities and opportunities that are impossible to see clearly right now. It’s more important to explore than to predict, and that’s what we need to do today. We don’t need to understand the future to be open to it.
An earlier version of this article first appeared on Inc.com.
Greg Satell is a popular author, speaker, and innovation adviser who has managed market-leading businesses and overseen the development of dozens of pathbreaking products. Follow Greg on Twitter @DigitalTonto. His first book, Mapping Innovation, was selected as one of the best business books of 2017 by 800-CEO-READ.