The Future of Computers: From Brilliant Idiots to Learning Machines

As technology advances, computers are moving beyond mere calculation and beginning to analyze data and learn from it.
IBM Cognitive Computing


Today, computers are brilliant idiots. They have tremendous capacities for storing information and performing numerical calculations--far superior to those of any human. Yet when it comes to another class of skills--understanding, learning, adapting, and interacting--computers are woefully inferior to humans. Because of these limitations, there are many situations in which computers can't do much to help us.

The problem is that storing information and performing calculations is no longer sufficient. With the emergence of the Big Data movement, which draws on sources previously unavailable, such as social networks and health care research, new systems need to be created that can gain insights from these sources. This shift in technology, referred to by IBM as the era of cognitive systems, will allow the machines of the future to do much more than compute.

"Cognitive systems are fundamentally different. Traditional computers, which are still based on the blueprint that mathematician John von Neumann laid out in the 1940s, are programmed by humans to perform specific tasks," said Dr. John Kelly, IBM Senior Vice President and Director of IBM Research. "Cognitive systems are capable of learning from their interactions with data and humans--essentially continuously reprogramming themselves."
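The distinction Kelly draws can be illustrated with a toy sketch (this is purely illustrative and is not IBM's actual technology): a conventional program applies a rule fixed in advance by a human, while a learning system adjusts its own parameters in response to the data it sees.

```python
# Illustrative sketch only: contrasting a fixed, human-written rule
# with a system that adjusts itself from labeled examples.

def programmed_classifier(value):
    # Von Neumann-style program: the threshold (50) is chosen by the
    # programmer and never changes, no matter what data arrives.
    return value > 50

def learned_threshold(examples, steps=1000, lr=0.1):
    # Toy "learning" loop: the threshold is nudged toward any example
    # it misclassifies -- in effect, the program rewrites its own rule.
    threshold = 0.0
    for _ in range(steps):
        for value, label in examples:
            prediction = value > threshold
            if prediction != label:
                # Move the threshold toward the misclassified example.
                threshold += lr * (value - threshold)
    return threshold

# Labeled observations: values below 50 are "negative", above are "positive".
data = [(10, False), (30, False), (70, True), (90, True)]
t = learned_threshold(data)
# The learned threshold ends up between the negative and positive examples,
# without any human having specified where the boundary lies.
```

The point of the sketch is only the difference in where the rule comes from: in the first function a human encodes it; in the second, repeated exposure to data shapes it.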

To learn more about this new era of computing, download a free sample of Smart Machines: IBM's Watson and the Era of Cognitive Computing, a new book by IBM Research Director John Kelly and author Steve Hamm.

