IBM rethinks the transistor to keep scaling compute power

IBM has come up with a model for a new transistor, the device at the heart of every chip and the foundation of our tech-heavy society. The potential breakthrough is a new coating that allows the transistor to read ionic signals instead of electric ones. This matters because it could let chipmakers put more transistors on a chip.

The tech giant hasn’t actually built a chip with the new transistor; instead, it has demonstrated a rough circuit. IBM expects the technology to leave the lab within the next five to seven years, and if that happens, it could be produced using the same manufacturing processes in use today.

Why we need a new transistor

The benefit of this fundamental shift is that chipmakers can continue to make smaller chips with more processing power and keep to the schedule set by Moore’s Law. That “law” observes that the number of transistors on a chip doubles roughly every two years. However, this doubling has pushed the chip industry to its limits: packing those transistors onto small chips is like cramming a bunch of angsty teenagers into a small space.
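To get a feel for what that doubling schedule implies, here is a back-of-the-envelope sketch (illustrative numbers only, not IBM's figures):

```python
# Back-of-the-envelope Moore's Law projection (illustrative only).
def transistors_after(years, start_count, doubling_period_years=2.0):
    """Projected transistor count after `years`, doubling every period."""
    return start_count * 2 ** (years / doubling_period_years)

# Example: starting from a hypothetical 1 billion transistors, a two-year
# doubling period yields a 32x increase over a decade -- five doublings.
print(transistors_after(10, 1e9) / 1e9)  # prints 32.0 (billions)
```

That exponential growth is exactly why each new generation of chips has to squeeze so many more devices into the same area.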

One might argue that Moore’s Law doesn’t matter, but the ever-decreasing cost of computing power is why Google can deliver its awesome search index for the pennies people pay in search advertising, and why Facebook can spend hundreds of millions on its infrastructure and still not charge you a thing.

A big problem with smaller transistors (and more of them on a chip) is leakage: electrons escape even when a transistor is switched off, wasting power as heat. The new coating and the use of ions as signals reduce leakage, which means chipmakers can keep placing more transistors on the chip while the cost of computing continues to fall.

And that is a good thing.
