Last week, RIoT celebrated our 10th anniversary. In that column, I discussed the past 10 years of IoT and data analytics, highlighting a number of technology tipping points we experienced and what they mean for this region as we strive to become the Global Center of Excellence in the Data Economy. You can read that piece here.

This week, I’d like to take a forward look at the most likely areas of significant advancement in the next 10 years.

First, let me define “data economy.” This is the concept that, in the future, every industry will rely heavily on the real-time automation of data. No longer will it be good enough to simply use information accessible via the internet to run our businesses and enjoy our daily activities. We will instead rely on continuous, low-latency capture, analysis and response to data.

Here are a few examples of data automation that will become common.

●     Autonomous cargo and human transit – Self-navigating cars, drones and robots collect data from the environment around them to safely navigate.

●     Persistent health – Wearable and implantable sensors never stop monitoring health biomarkers. This shifts us away from treating illness (reactive) and towards maintaining health (predictive and prescriptive).

●     Optimized energy management – We shift from today’s unidirectional energy generation and transmission to a mesh network that stores and transmits energy similarly to how the internet stores and transmits data packets. Our cars and homes become part of this energy system, receiving, storing and delivering energy to meet network needs in real time.

●     Systems of systems – Disparate data sets are fused to create greater benefit across systems. For example, dense stormwater monitoring networks fuse with traffic management systems, automatically rerouting traffic away from flash flood points before the water arrives.

The central driver of the data economy will remain our ability to perform ever more complex computation at ever lower cost. In the last 10 years, the most powerful microchips have increased roughly 40-fold in transistor count. Last week, NVIDIA announced a new chip with 206 billion transistors. In 2014, their most powerful chip had just 5 billion.

There are two significant challenges to overcome, both related to basic physics.  One relates to size and the other to energy.

Years of nanotechnology research have yielded commercially viable methods for manufacturing ever smaller transistors, so we can fit more and more of them onto every microchip. But we are reaching a point where transistor features approach the scale of individual atoms. While physicists have discovered particles smaller than atoms, we are nearing a size limit for “traditional” transistors.

The second challenge is that the activation of trillions upon trillions of transistors generates tremendous heat. The secretary of energy for the state of Virginia recently shared that fully one-third of all energy generated in Virginia in 2023 was consumed by data centers, each requiring massive amounts of water for cooling. A single training run for a large language model like the one behind ChatGPT uses the equivalent energy of powering more than 20 homes for an entire year. Investment in new data centers is outpacing investment in energy generation. At the same time, water is becoming more scarce. Traditional data centers are becoming a significant hurdle in our ability to combat climate change (even as they help us analyze and seek solutions).

The most promising solution to both the size and energy barriers is quantum computing. For certain classes of problems, quantum computers offer exponentially greater computing power than traditional silicon transistor systems.

Consider an oversimplified comparison. A binary computer with a register of just 20 bits can be in any one of 2^20 (1,048,576) possible solution states at a given moment. An equivalent 20-qubit quantum computer can, in a sense, occupy every single one of those states at the same time.
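For readers who want to see that arithmetic spelled out, here is a minimal illustrative sketch in Python (the language choice and print statements are mine, purely for illustration):

```python
# Illustrative arithmetic only, echoing the oversimplified comparison above.

n = 20

# A classical 20-bit register occupies exactly ONE of its 2**20
# possible states at any given moment.
classical_states = 2 ** n
print(f"{n} classical bits: one of {classical_states:,} states at a time")

# A 20-qubit register is described by a superposition of amplitudes
# across ALL 2**20 basis states at once -- which is also why simulating
# n qubits on classical hardware takes on the order of 2**n numbers.
print(f"{n} qubits: amplitudes spanning all {2 ** n:,} states at once")
```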

Both IBM and Atom Computing have already built quantum computers with more than 1,000 quantum bits (qubits), and dozens of companies around the world are already using quantum computers for practical applications like financial modeling and supply chain optimization.

Quantum computing is likely to be the biggest story of the next decade, bringing sufficient computing power to analyze extremely complex systems and use cases.

I think the next most important shift will be towards augmented reality.

Humans have always embraced technology-based augmentation. We advance textiles and other material technologies to create increasingly functional clothing, augmenting our bodies’ ability to stay warm, stay cool and stay protected. We actively use electronics to augment our hearing (hearing aids) and our hearts (pacemakers). We are adding motors, sensors and automation to prosthetics, exoskeletons, braces and wheelchairs.

One of the oldest and most widely accepted technology augmentations is the humble eyeglass and contact lens. We now have the capability to add digital information directly into our field of view. Today’s applications are somewhat gimmicky, due to limitations in battery life, computing capability and heat dissipation in a small form factor. But we are quickly solving those challenges.

Once AR becomes mainstream, I believe we will never go back to simply looking at the world without augmenting our view. Vision “clarity” will mean more than 20/20 optics. It will also mean never forgetting a face, thanks to name-reminder prompts. We will never look away from assembling IKEA furniture to consult instructions, because the instructions will overlay what we are building. We will see when the food we are cooking in the frying pan has reached the right temperature and is safe to eat. Our entertainment and education will become richer and will no longer be tied to the television or the classroom. We will take digital breaks, of course, but the idea of not using AR in our daily lives will seem quaint to future generations.

The next 10 years will bring significant advances in curing disease, improving crop yields and personalizing medicine. All of these advances will be driven by our capability to analyze data sets at unprecedented scale. Every single industry will automate data to improve our quality of life.

The future is bright and I’m bullish on the data economy.  I would love to hear what technologies and use cases you anticipate will “tip” in the next 10 years.