IBM researchers get closer to brain-like computing

IBM researchers say they have whipped up a programming language, algorithms and applications to deploy on top of a computing system inspired by the human brain.

This is the latest progress IBM and its collaborators have made on a little project from the Defense Advanced Research Projects Agency (DARPA) dubbed Systems of Neuromorphic Adaptive Plastic Scalable Electronics, or SyNAPSE.

“It’s a very modest goal — it’s to build a brain-like computer,” said Dharmendra Modha, principal investigator and senior manager at IBM Research.

In recent years, Modha and his fellow researchers have simulated a neural network larger than a cat’s cortex and built chips that function like a human brain, adapting in response to new information in real time.

Simulating a brain in software

Researchers have since developed a massively parallel, multi-threaded software simulator for the brain-like architecture. The simulation covers 2 billion neurosynaptic cores, all connected to one another and representing around 100 trillion synapses, which is roughly the number inside a human brain, Modha said.

Each of those neurosynaptic cores can be thought of as a building block: it contains 256 neurons recreated on chips for computation, 256 axons that carry information into the core from neurons on other cores, and tens of thousands of synapses that serve as memory.
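
A quick back-of-the-envelope check (my arithmetic, not IBM’s): if each core wires its 256 axons to its 256 neurons in a full crossbar, that comes to 256 x 256 = 65,536 synapses per core, which squares with “tens of thousands.” Multiply by the simulator’s 2 billion cores and you get roughly 131 trillion synapses, in the same ballpark as the 100 trillion figure above.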

In addition to the simulation, the researchers have built a mathematical model of a neuron, a set of equations simple enough to implement with a few transistors in silicon, Modha said.
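
Modha didn’t spell out those equations, but the flavor is easy to sketch. Below is a minimal leaky integrate-and-fire style update in Python, my own illustration of the kind of simple per-step rule this sort of hardware favors; it is not IBM’s actual model, and the names and constants are invented.

    # Illustrative only: a toy leaky integrate-and-fire neuron update,
    # not IBM's published model. Names and constants are invented.
    def step_neuron(v, incoming_spikes, weights, leak=1.0, threshold=100.0, reset=0.0):
        """Advance one neuron by a single time step and report whether it fired."""
        # Integrate: add the weight of every axon that spiked this step.
        v += sum(w for spiked, w in zip(incoming_spikes, weights) if spiked)
        # Leak: the membrane potential decays a little each step.
        v -= leak
        # Fire and reset once the potential crosses the threshold.
        if v >= threshold:
            return reset, True
        return v, False

    # Example: three axons, two of which fire this step.
    # Potential becomes 98 + 2 + 3 - 1 = 102, which crosses 100,
    # so fired is True and the potential resets to 0.
    v, fired = step_neuron(v=98.0, incoming_spikes=[1, 0, 1], weights=[2.0, 5.0, 3.0])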

A language to speak brain tongue

They’ve created a new object-oriented programming language, Corelet, for designing, developing and debugging corelets, which are abstractions of networks of neurosynaptic cores. The corelet system and language are better suited than traditional programming approaches to handling lots of simple tasks and processing them at the same time, the way a human brain does.
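
IBM hasn’t published the Corelet syntax here, so the sketch below is purely hypothetical Python, meant only to illustrate the composition idea: small blocks that hide their internal wiring, expose named inputs and outputs, and snap together into bigger blocks.

    # Hypothetical illustration of the corelet idea -- this is not IBM's
    # Corelet language or API; class and method names are invented.
    class Corelet:
        """A reusable block wrapping one or more neurosynaptic cores.

        Only the named input and output connectors are visible to a user;
        the neuron-and-synapse wiring inside stays hidden.
        """
        def __init__(self, name, inputs, outputs):
            self.name = name
            self.inputs = list(inputs)    # exposed input connectors
            self.outputs = list(outputs)  # exposed output connectors
            self.wires = []               # (source connector, target connector) pairs

        def connect(self, source, other, target):
            """Route spikes from one of this corelet's outputs into another corelet's input."""
            self.wires.append((self.name + "." + source, other.name + "." + target))

    # Snap two blocks together into a tiny perception pipeline.
    edges = Corelet("edge_detector", inputs=["pixels"], outputs=["edges"])
    motion = Corelet("motion_tracker", inputs=["edges"], outputs=["direction"])
    edges.connect("edges", motion, "edges")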

The researchers also built an environment in which developers can work with corelets, and wrote a batch of algorithms and applications — such as predicting sequences, detecting motion, or looking at a score by Bach or Beethoven and identifying the composer. Finally, they created a curriculum so others can learn the Corelet language, the library of algorithms and applications, the chip simulator and other parts of the project.

DARPA has given IBM $12 million to take SyNAPSE into its next phase. The money will go toward improving hardware, software and education, a spokesman wrote in an email. Future work could involve packing more neurons into the hardware and testing for more complex perception of the outside environment.

Cognitive computing has been a focus at IBM at least as far back as 2006, and the company was simulating neural nets as early as 1956.

Imagine the possibilities

These latest breakthroughs bring IBM — and DARPA — a few steps closer to being able to run a sensor-equipped device that is very small and uses very little power. That could lead to plenty of custom applications, such as drones and unmanned ground vehicles, but the technology could also be mass-produced to usher in new kinds of personal and industrial computing down the line.

But Modha didn’t speak of any products in the pipeline. So we are left to wonder with our brains — and now it’s clearer just how powerful they actually are — what other tiny technology could come out of this work, and how soon ordinary people will have access to it.

To see Modha talk at length about the SyNAPSE project, check out his 2010 video interview with my colleague Stacey Higginbotham.

