IBM's Memory Chip: A New Era Or Another Noble Failure?


IBM this month unveiled a memory technology that could radically reshape computing as well as emerging markets like the Internet of Things.

And who knows—this time it might come true.

To recap, IBM Zurich showed off a prototype phase change memory device that IBM says can last one million write cycles, hold tremendous amounts of data, and deliver it far faster than existing technologies. With devices built on this stuff, you could dramatically cut power consumption in data centers or build unobtrusive medical devices that monitor all of your vital signs.

The big question now is whether it will take over the world or fizzle out.

On one hand, phase change memory is a phenomenal idea. Its performance characteristics give it a chance to displace hard drives and flash memory as storage technologies. Even more important, it could replace DRAM, the expensive yet fast technology that holds data in close proximity to processors.

DRAM is a $33-billion-plus market in deep trouble. Chip makers are finding it increasingly expensive and difficult to push the boundaries of Moore's Law and shrink transistors. If transistors can't keep shrinking, the steady improvements in cost and performance we have come to expect from computing stall. To get around these challenges, flash memory makers (and likely makers of other devices like FPGAs) are moving toward 3D architectures in which circuits are stacked on top of each other: the same cost and performance benefits of Moore's Law, but by a different means.
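To see why stacking can stand in for shrinking, here is a rough sketch of the density arithmetic. This is a deliberate simplification; real scaling involves yield, cost per layer, and much else beyond this back-of-the-envelope math.

```python
# Rough density arithmetic behind the 2D-shrink vs. 3D-stack argument.
# Shrinking the feature size by a linear factor s packs s**2 more cells
# into the same chip area; stacking L layers multiplies capacity by L.
def density_gain_shrink(scale_factor: float) -> float:
    return scale_factor ** 2  # area density scales with the square of linear size

def density_gain_stack(layers: int) -> int:
    return layers  # each added layer adds one full plane of cells

print(density_gain_shrink(2))  # halving feature size -> 4x density
print(density_gain_stack(4))   # four stacked layers  -> 4x density
```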

DRAM can't go 3D, according to researchers I've talked to. Phase change memory, therefore, could overtake it over the next decade, just as the cost/performance benefits of flash memory shoved hard drives out of notebooks and are displacing them in data centers. IBM's chip is not 3D, but the company claims it can still win on price, performance, and yield. (Disclosure: earlier I stated that IBM's chip was 3D. It's not, but that doesn't change the thrust of the story.)

On the other hand, IBM has a long history of coming out with futuristic, whiz-bang ideas for the semiconductor market that never seem to get out of the lab. Back in the 2000s, the company promoted something called Racetrack Memory. It's still in development. There were also Millipede, MRAM, STT-RAM, and spintronics: all great ideas that never took the world by storm. (IBM has also been involved in the now nearly two-decade quest to bring extreme ultraviolet lithography to a fab near you.)

What's the problem? Memory and storage are among the most challenging markets in the world. New memory ideas require monumental advances in materials science. Billions are required for R&D and new fabrication facilities.

And yet, in the end, you might only break even, because competition can drive prices to the floor rapidly. Analyst Jim Handy once estimated that no company actually made money on flash memory during its first two decades. There have been over 200 hard drive manufacturers. Now there are only three (Western Digital, Seagate and Toshiba).

It’s more like a bad gambling habit than a business.

Phase change memory in particular is notoriously just off the horizon. Gordon Moore, in a 1970 essay, predicted it could be commercially produced within a decade. In the '90s and 2000s, Intel and Micron invested billions in a phase change company called Numonyx that has all but been neutered. Philips and others came up with phase change designs. After more than 45 years, phase change memory has been used only in a few pachinko machines.

Intel's XPoint technology, due over the next few years, is phase-change-like and promises to be the first widely deployed technology of its kind. (IBM's design, by the way, involves putting three bits of data into each phase change cell. Intel is at two bits, but a move to three has been expected.)
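To make the bit-density point concrete: storing n bits in a single cell means reliably telling apart 2^n distinct resistance levels, which is why each extra bit is hard-won. Here is a back-of-the-envelope sketch; the billion-cell array is a hypothetical figure chosen only for illustration.

```python
# Back-of-the-envelope: n bits per cell requires distinguishing
# 2**n distinct resistance levels in each phase change cell.
def levels_required(bits_per_cell: int) -> int:
    return 2 ** bits_per_cell

for bits in (1, 2, 3):
    print(f"{bits} bit(s)/cell -> {levels_required(bits)} levels to distinguish")

# The payoff: the same array of cells holds proportionally more data.
CELLS = 1_000_000_000  # hypothetical one-billion-cell array
for bits in (2, 3):
    print(f"{bits} bits/cell -> {CELLS * bits / 8 / 1e6:.0f} MB per array")
```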

So IBM is developing a technology that will require tremendous breakthroughs and ultimately sell for commodity prices. Wait, it gets worse. IBM also won't manufacture the technology itself. It wants to license it to other manufacturers. Samsung and Hynix, two of the biggest DRAM makers in the world, clearly know they need a roadmap to the future.

IBM could have the answer. Chip makers, however, are notoriously wary of licensing deals. Who wants to pay royalties on a product that sells for nearly at cost, after all? IBM would make all the profits and take the least risk. ARM has succeeded as an intellectual property firm because it sells a complete processor design that would otherwise be difficult to replicate. (ARM also prides itself on good customer service. Other IP firms, like Rambus and Tessera, found themselves embroiled in court disputes.)

IBM already has a licensing program for its Power processor. It's open source.

Somewhere, a Samsung executive is looking at IBM's plan, admiring it even, and berating his or her chief of R&D, asking why they haven't concocted something similar or better.