Viewpoint: Moore’s law isn’t broken - it’s overheated

Nick Harris, CEO and co-founder of US photonic computing specialist Lightmatter, explains how advances in photonic computing technology could give Moore’s Law a shot in the arm.

Recent advances in machine learning, computer vision, natural language processing, deep learning and more are already affecting life and humanity in ways both seen and unseen. This is especially true of artificial intelligence (AI), whose demands are growing at a blistering rate. Training AI models today requires ultra-high-performance computer chips, leading to what one might call a ‘space race’ among top technology companies to build, acquire, or secure exclusive access to the highest-performance chips as soon as they come to market.

The problem is that the latest high-performance chips are being produced using yesterday’s design ideas and technology. The result is progressively smaller chips that consume a similar amount of energy, leading to unsustainable levels of energy density. Much of that energy is dissipated as heat, which degrades the chip’s performance and reduces the energy efficiency of the compute system overall. This “shrink = energy density increase” problem is at the root of the divergence from Moore’s Law, a divergence that itself threatens the pace of AI innovation. A further important, but less talked about, negative impact is the environmental cost of removing all of that wasted energy in the form of heat.
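
To make that scaling argument concrete, here is a minimal sketch of the arithmetic. The chip power and die areas below are purely illustrative assumptions: if power stays roughly constant while a node shrink halves the die area, the power density the cooling system must remove roughly doubles.

```python
# A minimal sketch of the 'shrink = energy density increase' problem:
# power stays roughly the same while the die area halves, so the power
# density to be removed roughly doubles. All numbers are illustrative.
power_w = 300.0       # assumed chip power, similar across nodes
area_old_cm2 = 6.0    # assumed die area on the older process node
area_new_cm2 = 3.0    # the same design after a node shrink

for label, area_cm2 in [("old node", area_old_cm2), ("new node", area_new_cm2)]:
    print(f"{label}: {power_w / area_cm2:.0f} W/cm^2")
# old node: 50 W/cm^2
# new node: 100 W/cm^2
```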

At its core, Moore’s Law (which is more an observation than a physical law) states that the number of transistors on a microchip doubles roughly every two years. For decades, its real-life manifestation has been increasing compute speed and capability alongside a halving of cost every two years. That is the fuel behind the performance gains and miniaturization of electronic devices, and the ubiquity of the internet, from which humanity has benefited.
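
As a rough illustration of how that doubling compounds, the sketch below projects a transistor count forward under an idealised two-year doubling period; it is an illustration only, not a model of any specific product line.

```python
# Illustrative sketch only: the compounding implied by Moore's Law,
# assuming an idealised doubling of transistor count every two years.

def transistors(start_count: float, years: float, doubling_period_years: float = 2.0) -> float:
    """Project a transistor count forward under a fixed doubling period."""
    return start_count * 2 ** (years / doubling_period_years)

# Starting from the ~2,300 transistors of the 1971 Intel 4004, fifty years
# of two-year doublings lands in the tens of billions, roughly the scale
# of today's largest chips.
print(f"{transistors(2_300, 50):,.0f} transistors")  # ~77 billion
```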

Lightmatter is developing next-generation processors that use light rather than electricity. Image: Lightmatter

At the same time, questions are being asked about the future of Moore’s Law as a number of critical compute-intensive applications evolve faster than chip scaling. This discrepancy in rate and scale threatens the continued rapid development of increasingly foundational pillars of innovation such as AI. Since 2010, AI compute requirements have grown at five times the rate of Moore’s Law, doubling approximately every 3.5 months. Given the growing number of life-altering applications built on top of AI engines, finding a solution to this performance-scaling mismatch is high on every serious fabless and chip-manufacturing company’s priority list. After all, training a single neural network using today’s most advanced chips generates emissions equivalent to those of five cars over their entire lifetimes.
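
Taking the doubling periods cited above at face value, a quick calculation shows how dramatic the mismatch becomes over a single Moore’s Law cycle; this is a sketch of the arithmetic only.

```python
# Illustrative sketch only: how demand compounds when the doubling period
# shrinks from roughly 24 months (Moore's Law) to the ~3.5 months cited
# above for AI compute. The periods are the article's figures; the rest
# is simple arithmetic.

def growth_factor(months: float, doubling_period_months: float) -> float:
    """Total growth after `months` under a fixed doubling period."""
    return 2 ** (months / doubling_period_months)

print(f"Chip capability over two years (Moore's Law): {growth_factor(24, 24):.0f}x")   # 2x
print(f"AI compute demand over two years:             {growth_factor(24, 3.5):.0f}x")  # ~116x
```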

Crunch time: innovations in data centre technology

While many declare Moore’s Law to be broken or no longer valid, in reality it is not the law that has failed: the real problem is heat. Heat arises because each new process node no longer delivers the energy reduction that shrinking once provided for free. The high temperatures generated inhibit the performance of the chip, and removing that heat increases the running costs and environmental impact of increasingly large and numerous data centres around the globe.

A solution may finally be here, in the form of a proven paradigm shift in semiconductor chips: a new way to compute and a new way to connect chips using photonics in conjunction with electronics. Computing with photons instead of electrons has given birth to ultra-low-power, high-performance photonic computers that outperform state-of-the-art conventional devices by a factor of up to 100 at one-tenth of the power. When integrated with a new generation of photonic communication fabric that enables heterogeneous chip-to-chip communication, rack-scale compute systems become possible. This innovative, yet proven, processor and communications capability, powered by photons rather than electrons, is set to propel chip performance forward and give AI compute the opportunity to get back on track with Moore’s Law.
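
The article does not spell out what such a processor computes, but the workhorse operation of AI training and inference, whether executed electronically or optically, is the dense matrix-vector product of a neural-network layer. The minimal sketch below shows that operation numerically; a photonic processor carries out the same multiply-accumulate in the optical domain.

```python
# A minimal numerical sketch, assuming (as with most AI accelerators,
# photonic or electronic) that the core workload is the dense
# matrix-vector product of a neural-network layer. A photonic processor
# performs this multiply-accumulate optically, but the linear algebra it
# must reproduce is the same as shown here.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((512, 256))  # one layer's weight matrix
activations = rng.standard_normal(256)     # incoming activation vector

output = weights @ activations             # the layer's core computation
print(output.shape)                        # (512,)
```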

The shot in the arm that Moore’s Law needs - photonic computing

The belief among many sharp minds in the computing industry is that Moore’s Law is outdated or dead. This belief is largely driven by the seemingly insurmountable challenge of continuing to shrink transistors, in both physical size and energy consumption, while keeping spiraling energy density and the associated chip temperatures under control. In the absence of a solution to this root cause, other out-of-the-box, if brute-force, solutions are being evaluated to remove the excess heat generated by racks full of hundreds of thousands of chips operating in thousands of data centres around the globe. One such solution to the temperature overload problem is Microsoft’s deep-sea cloud project, which in 2018 placed a data centre in a hermetically sealed, nitrogen-filled container on the seabed off the coast of Scotland to take advantage of ‘cost-free’ cooling by the cold waters of the North Sea. The project has demonstrated some promising results, not least lower component failure rates, largely attributed to the inert nitrogen environment. Even so, moving every data centre underwater is clearly not a scalable answer to the looming shortfall in compute performance and energy efficiency needed to sustain the pace of AI innovation.

Lightmatter recently introduced Lightmatter Passage, a wafer-scale, programmable photonic interconnect that allows arrays of heterogeneous chips to communicate with each other at unprecedented speeds. Image: Lightmatter

The world, and Moore’s Law, needs a highly successful computing option that doesn’t require a retrieval team, atypically calm seas and a choreographed dance of robots and winches that maneuver between the pontoons of a gantry barge. Forcing electrons around and between chips generates heat, and the smaller the conductor and the chip, the higher the energy density and the worse that problem becomes. Beyond a point, around 400 watts, it becomes impossible to pull enough heat out of the billions of small transistors crammed together on the chip, so the performance benefit of shrinking and increasing transistor packing density is lost as transistors lie dormant to stay within thermal budgets. Photonic computing has surfaced as a solution that answers the call for increased compute speed, low energy density and reduced chip heating.
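
A toy calculation illustrates the ‘dormant transistors’ point. The roughly 400-watt budget is the figure cited above; the transistor count and per-transistor switching power below are hypothetical placeholders, chosen only to show the shape of the problem rather than to describe any real chip.

```python
# A toy illustration of why transistors must 'lie dormant': all figures
# except the article's ~400W cooling budget are hypothetical assumptions.
thermal_budget_w = 400.0             # article's practical cooling limit
transistors_on_chip = 50e9           # assumed transistor count for a large die
watts_per_active_transistor = 2e-8   # assumed switching power per transistor

max_active = thermal_budget_w / watts_per_active_transistor
fraction_active = min(1.0, max_active / transistors_on_chip)
print(f"At most {fraction_active:.0%} of the transistors can switch at once;")
print("the rest must sit idle to stay within the thermal budget.")
```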

A bright future for a revamped and reframed Moore’s Law

For Moore’s Law, it’s about connecting the reality of today’s technical constraints with the needs of tomorrow’s data-centric world. Generating less heat while still making chips smaller and faster is the real solution to keeping up with the world’s ever-growing need for compute performance.

Currently, huge heat sinks with high-performance system fans, water cooling and Microsoft’s ocean-floor cloud project are among the best ideas being advanced to attack the problem. This focus on outlandish ways to cool the compute process is very costly and amounts to throwing away money. In reality, these approaches simply address the symptom, not the cause.

The most cost-effective answer to cooling things down is not to waste energy as heat in the first place. That’s where the paradigm shift to photonic computing wins: it provides a computing platform that accelerates AI growth to tackle the world’s biggest challenges while reducing the environmental impact of overheating data centres all over the world.

Photonic compute and communication are providing a path to getting chip scaling and performance back on track with Moore’s Law while simultaneously reducing the environmental impact of the data centres needed to sustain the pace of AI innovation. Photonic computing is set to provide the pick-me-up Moore’s Law needs, helping to solve the world’s biggest problems and supporting a more sustainable environmental outcome for all.

Nick Harris is the CEO and co-founder of Boston photonics startup Lightmatter.