Cognitive Computing Systems: As They Shall Reason

As computer learning capabilities grow, they will become co-collaborators with humans.
Grady Booch, IBM Fellow


During World War II, Vannevar Bush served as director of the U.S. Office of Scientific Research and Development (OSRD), from which arose projects ranging from the very pragmatic (such as the proximity fuse), to the speculative (the ENIAC comes to mind), to the profound (namely, the Manhattan Project). After the war, Bush turned his attention to applying the scientific infrastructure born in conflict to the needs of a world at relative peace. In 1945, in The Atlantic, Bush published "As We May Think," in which he described the Memex. As he wrote, "The world has arrived at an age of cheap complex devices of great reliability; and something is bound to come of it."

Indeed, much has come of it. Bush's vision laid the foundation for the fundamental mechanisms of the Web, and his work helped set into motion much of what constitutes modern computing. Were he with us today, I think he would be astonished to discover just how far computing has come, and just how much it has woven itself into the world.

And, we are just beginning.

A generation is now with us that was born digital and does not know a world before the Internet, the personal computer, or the smartphone. This is a world possessing abundant computational power, with systems that never forget and connectivity that binds us in many ways, sometimes beyond our choice. As a result, we are faced with an exquisite opportunity to use those digital inventions for the advancement of the human spirit.

Bush went on to say that "...creative thought and essentially repetitive thought are very different things. For the latter there are, and may be, powerful mechanical aids. ... Man cannot hope fully to duplicate this mental process artificially."

In many ways, the earliest generations of computing technology were focused on these tedious, repetitive tasks, driven by the needs of commerce and industry. Later generations -- the minicomputer, the personal computer, and now the mobile computer -- changed the playing field, indeed, the world, entirely. Now, we are faced with computing systems that have woven themselves so intimately into our lives and our societies that we require them to be far more than just very fast and reliable substitutes for calculation. We desire that they become our assistants, our helpmates, our partners. The clear and present reality of big data makes this even more urgent: so many elements of the world are connected in so many ways, and are in the process generating so much raw information, that it is impossible for any human to metabolize it. As Douglas Rushkoff describes it, this is the problem of present shock.

As such, we see the rise of what some call cognitive computing. Now, the journey toward an artificial intelligence has been part of our mythology and our science for many years, but this seems to be something subtly different. We don't necessarily seek sentient computers; rather, we seek software-intensive systems that can reason, and that can attend to information that is ambiguous, self-contradictory, or fragmented. We also don't expect them to be disembodied: they must sense and act in the physical world. This is the context that a human faces, but with clear differences: the human possesses feelings, goals, and values, while the machine is tireless, is capable of absorbing far more information, and can do so astonishingly fast.

Big data offers a use case that is forcing us to come to grips with how we might build such systems. Since the era of knowledge engineering systems a few decades ago, computer science has advanced considerably, and rather than using purely symbolic approaches, it has found ways that are more heuristic, based more on probabilities, and able to learn. While Watson's victory over human champions on Jeopardy! was impressive, the more important victory is that we now have a solid architectural foundation on which to apply those ideas to a vast set of domains.

Yes, we have just begun. And with this next generation of computing systems, they shall reason with us and in so doing become our co-collaborators.

About the Author

Grady Booch, IBM Fellow
Grady Booch is a world-renowned computer scientist who is recognized for his innovative work in software engineering. Booch is an IBM Fellow -- IBM’s highest technical position -- and serves as Chief Scientist at IBM Research. 

