Systems That Perceive, Think, and Act

Technological advances are allowing scientists to begin building a cognitive computer that functions like a brain.
Dr. Dharmendra Modha, Manager of Cognitive Computing, IBM

Since computers were invented, they've been called "brains."

Yet, the fundamental tasks at which computers and human brains excel, the vastly different designs underlying each, and the brain's remarkable ability to learn and adapt have always set them poles apart -- until now.

By bringing together recent advances in neuroscience, supercomputing, and nanotechnology, we're at the beginning stages of creating cognitive machines inspired by the function, low power, and compact volume of the organic brain.

The world needs these new approaches, and it needs them now. We're populating the Earth and space with sensors, cameras, and microphones, but the needle of information is lost in a haystack, nay, an ocean of data. Processing this tsunami of real-time, parallel, spatio-temporal, multi-modal data would be too expensive in power and too slow in response for traditional machines, but it would be ideal for a brain-like computer. Even more urgently, today's computers are hitting physical and architectural limits on their size and speed.

Modern computers are designed in the image of ENIAC, the first general-purpose electronic digital computer, completed in 1946, and of the stored-program design that grew out of it, which came to be known as the von Neumann architecture. Computers were meant for calculation and for handling precise, symbolic data. They separate memory from computation, have centralized processing, handle data sequentially, require programming, operate synchronously in a clock-driven fashion, are fast, and, as a result, are energy-hungry and hot.

The brain's neurons and synapses, however, form a network. The brain evolved over millions of years on the savannah to solve the basics: getting food, fighting, fleeing, and mating. It is built for handling low-resolution, ambiguous, sub-symbolic data. It integrates memory (synapses) and computation (neurons), has distributed processing, handles data in parallel, learns, operates asynchronously in an event-driven fashion, is slow, and, as a result, is energy-efficient and cool.
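
To make that contrast concrete, here is a toy sketch in Python. It is an illustration of the two styles only, not IBM's hardware or software; the neuron count, weights, and threshold are made up for the example.

```python
import random

N = 1000
# synaptic weights: weight[i][j] is the strength of the connection from neuron j to neuron i
weight = [[random.uniform(0.0, 0.01) for _ in range(N)] for _ in range(N)]

def clock_driven_tick(state):
    """von Neumann style: every neuron is recomputed on every tick, whether or not anything changed."""
    return [1 if sum(weight[i][j] * state[j] for j in range(N)) > 0.5 else 0
            for i in range(N)]

def event_driven_update(spikes, potential, threshold=1.0):
    """Brain-inspired style: only the fan-out of each incoming spike does any work,
    and only neurons that cross threshold emit new spikes."""
    new_spikes = []
    for j in spikes:
        for i in range(N):
            potential[i] += weight[i][j]
            if potential[i] >= threshold:
                potential[i] = 0.0          # fire and reset
                new_spikes.append(i)
    return new_spikes
```

In the clock-driven loop the cost is the same every tick regardless of activity; in the event-driven version the cost tracks how many spikes actually occur, which is where the energy advantage of the brain-like style comes from.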

Under the auspices of the Defense Advanced Research Projects Agency's (DARPA's) SyNAPSE program, IBM and several leading universities have been working on the challenge since 2008.

Organic technology akin to the brain's doesn't exist today, and developing it would require too much time and money. Meanwhile, the pressing problem of the data deluge cannot wait. So our key innovation is a new non-von Neumann, modular, parallel, distributed, event-driven, scalable architecture: one that can be synthesized in today's technology and that simultaneously serves as a beacon for the technology to come. The new architecture, in turn, necessitates an entirely new way of thinking, programming, and learning. This is cognitive computing -- a new synthesis of silicon and software.
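
As a rough illustration of what "modular, distributed, and event-driven" can mean in practice, here is a hypothetical sketch, not the actual SyNAPSE design: many identical cores, each keeping its synapses (memory) next to its neurons (computation), exchanging nothing but spike messages.

```python
from collections import deque

class Core:
    """A hypothetical neurosynaptic core: local memory (synapses) sits beside
    local computation (neurons), so there is no shared central memory."""

    def __init__(self, n_neurons, threshold=1.0):
        self.potential = [0.0] * n_neurons       # local neuron state
        self.synapse = {}                        # (src_axon, dst_neuron) -> weight
        self.threshold = threshold

    def connect(self, src_axon, dst_neuron, weight):
        self.synapse[(src_axon, dst_neuron)] = weight

    def receive(self, src_axon):
        """Process one incoming spike event; return the neurons that fire."""
        fired = []
        for (src, dst), w in self.synapse.items():
            if src == src_axon:
                self.potential[dst] += w
                if self.potential[dst] >= self.threshold:
                    self.potential[dst] = 0.0    # fire and reset
                    fired.append(dst)
        return fired

def run(cores, routes, initial_events, max_events=1000):
    """Event-driven simulation: spikes travel between cores as messages, and a
    core does work only when a spike addressed to it arrives."""
    events = deque(initial_events)               # items are (core_id, src_axon)
    handled = 0
    while events and handled < max_events:
        core_id, src_axon = events.popleft()
        for neuron in cores[core_id].receive(src_axon):
            # routes maps (core_id, neuron) -> list of (target_core, target_axon)
            for target_core, target_axon in routes.get((core_id, neuron), []):
                events.append((target_core, target_axon))
        handled += 1
```

Because the cores share nothing but spike messages, scaling up is a matter of adding more cores rather than building a bigger central processor.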

As a first step, in 2011, we demonstrated tiny cognitive chips at the scale of a worm's nervous system. We taught the chips to play Pong, one of the earliest computer games, and demonstrated capabilities such as navigation, machine vision, and pattern recognition. Our next-generation chip will graduate from the nervous system of a worm to the nervous system of a bee. The end game is to demonstrate a system with 100 trillion synapses, roughly human scale, that occupies merely two liters while consuming barely one kilowatt of power. To power the same capability on today's computers would require, arguably, a nuclear reactor. In contrast, we want to build, literally, a brain-in-a-box!
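
A back-of-envelope check, using the figures above plus two round outside numbers (a large power plant at roughly a gigawatt and the biological brain at roughly 20 watts, both approximations rather than figures from this article), shows why the gap is described in orders of magnitude.

```python
synapses = 100e12            # 100 trillion synapses, the stated target
target_power_w = 1_000.0     # ~1 kilowatt for the brain-in-a-box target
reactor_power_w = 1e9        # ~1 gigawatt, a rough figure for a large power plant (assumption)
brain_power_w = 20.0         # ~20 watts, a commonly cited estimate for the human brain (assumption)

print(f"power budget per synapse:    {target_power_w / synapses:.0e} W")        # ~1e-11 W
print(f"power plant vs. target:      {reactor_power_w / target_power_w:.0e}x")  # ~1e+06x, six orders of magnitude
print(f"target vs. biological brain: {target_power_w / brain_power_w:.0f}x")    # the brain is still ~50x more frugal
```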

The quest will require significant time, resources, and innovation, but will unleash a cognitive computing revolution. These new systems will pull data, including sights, sounds, and smells, from massive arrays of sensors and draw conclusions from that data, in effect turning the sensors themselves into computers.

In the future, these chips could power low-energy, light-weight glasses that help blind people navigate; "eyes" that let robots and cars see; health-care systems that monitor blood pressure, temperature, and oxygen levels of the elderly at home and send alerts before problems occur; and systems that measure the tide, air, and wind speed to predict tsunamis.

These cognitive computing chips and today's existing computers will complement each other, like yin and yang, mapping new ways to improve the world's productivity and sustainability.

About the Author of This Post

Dr. Dharmendra Modha, Manager of Cognitive Computing, IBM
Dr. Dharmendra Modha is the founder of IBM’s Cognitive Computing group at IBM Research in Almaden, Calif., and is the global principal investigator for the DARPA SyNAPSE team. In this role, Dr. Modha leads a global team spanning neuroscience, nanoscience, and supercomputing to build a computing system that emulates the brain’s abilities for perception, action, and cognition – all while consuming many orders of magnitude less power and space than today’s computers.
