IBM wires up 'neuromorphic' chips like a rodent's brain

IBM has been working with DARPA's Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program since 2008 to develop computing systems that work less like conventional computers and more like the neurons inside your brain. After years of development, IBM has finally unveiled the system to the public as part of a three-week "boot camp" training session for academic and government researchers.

The TrueNorth system, as it's been dubbed, employs modular chips that act like neurons. By stringing multiple chips together, researchers can essentially build an artificial neural network. The version that IBM just debuted contains about 48 million digital neurons -- roughly the computing capacity of a rat's brain -- spread across an array of 48 chips.
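What does it mean for a chip to "act like a neuron"? A loose software analogue -- a simplification for illustration, not IBM's actual neuron circuit -- is the leaky integrate-and-fire model: each unit accumulates weighted input spikes, slowly leaks charge, and fires when its potential crosses a threshold. The class and parameter names below are purely illustrative.

```python
# A toy leaky integrate-and-fire neuron -- a rough software analogue of the
# spiking neurons these chips implement in silicon, not IBM's actual design.

class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0        # membrane potential
        self.threshold = threshold  # firing threshold
        self.leak = leak            # fraction of potential retained each tick

    def step(self, weighted_input: float) -> bool:
        """Integrate one tick of input; return True if the neuron spikes."""
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True
        return False

# Chaining neurons so one unit's spike becomes the next unit's input is,
# in miniature, the idea behind wiring many chips into a larger network.
a, b = LIFNeuron(), LIFNeuron()
for t in range(10):
    spike_a = a.step(0.4)                       # constant drive into neuron A
    spike_b = b.step(1.2 if spike_a else 0.0)   # A's spike feeds neuron B
    print(t, spike_a, spike_b)
```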

These systems are designed to run "deep learning" algorithms -- the kind behind Facebook's facial recognition feature or Skype's instant-translation function -- but at a fraction of the cost, power draw and space required by conventional data centers. For example, a TrueNorth chip contains 5.4 billion transistors yet draws only 70 mW of power. An Intel processor, by contrast, contains about 1.4 billion transistors and draws between 35 and 140 watts.
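A quick back-of-the-envelope calculation using just the figures quoted above shows how stark that gap is (keeping in mind that transistor count is only a crude proxy for useful work):

```python
# Comparison based solely on the numbers cited in this article:
# TrueNorth: 5.4 billion transistors at 70 mW
# Intel CPU: 1.4 billion transistors at 35-140 W

truenorth_watts, truenorth_transistors = 0.070, 5.4e9
cpu_watts_low, cpu_watts_high, cpu_transistors = 35.0, 140.0, 1.4e9

print(f"TrueNorth: {truenorth_watts / truenorth_transistors:.1e} W per transistor")
print(f"Intel CPU: {cpu_watts_low / cpu_transistors:.1e} to "
      f"{cpu_watts_high / cpu_transistors:.1e} W per transistor")
print(f"Total power gap: {cpu_watts_low / truenorth_watts:.0f}x to "
      f"{cpu_watts_high / truenorth_watts:.0f}x")
```

By that rough measure, the TrueNorth chip draws about 500 to 2,000 times less power overall, despite carrying nearly four times as many transistors.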

In fact, future iterations of the TrueNorth system could (theoretically, at least) be shrunk down far enough to fit inside cell phones or smartwatches. These chips also hold an advantage over the GPUs (graphics processors) and FPGAs (field-programmable gate arrays) the industry currently uses, because TrueNorth chips operate in much the same way as the deep learning algorithms running on them, Peter Diehl, a PhD student in the cortical computation group at ETH Zurich, told Wired. With it, IBM hopes to eventually shift some computing work away from traditional data centers and onto end-user devices.

This should speed up processing, since data no longer has to travel back and forth over the network. Instead, a company could develop a deep learning model (say, one that counts the number of cars in a photo), distribute it from a central server and then have the model run directly on the user's TrueNorth-enabled device. The system could spot every car in the user's image gallery without uploading a single photo to a remote server for processing. Unfortunately, the system is still in its infancy and years away from your phone.
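To make that deployment pattern concrete, here's a minimal sketch of the "train centrally, run locally" idea. Every name in it (train_car_counter, deploy_to_device, scan_gallery) is a hypothetical placeholder, not a real TrueNorth or IBM API, and the "model" is a stub rather than an actual neural network.

```python
# Illustrative only: the model is built once in the data center, pushed to
# the handset, and then runs on-device so photos never leave the phone.
# All names here are hypothetical placeholders, not a real IBM API.

from typing import Callable, List

Photo = bytes
Model = Callable[[Photo], int]   # e.g. "how many cars are in this photo?"

def train_car_counter() -> Model:
    """Stand-in for training done centrally in the data center."""
    def model(photo: Photo) -> int:
        # A real model would run a neural network over the pixels;
        # this stub just pretends every photo contains one car.
        return 1
    return model

def deploy_to_device(model: Model) -> Model:
    """Stand-in for pushing the trained model down to the handset."""
    return model

def scan_gallery(photos: List[Photo], model: Model) -> int:
    """Run inference locally -- no photo is uploaded anywhere."""
    return sum(model(p) for p in photos)

if __name__ == "__main__":
    device_model = deploy_to_device(train_car_counter())
    gallery = [b"photo1", b"photo2", b"photo3"]
    print("cars found:", scan_gallery(gallery, device_model))  # -> 3
```

The point of the pattern is in scan_gallery: inference happens where the photos already live, so only the (comparatively tiny) model crosses the network, once.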

[Image Credit: IBM]