Daniel Kahneman on 'Emergent Weirdness' in Artificial Intelligences


Our machines' computational biases are not the same as our brains' cognitive biases, which is going to be weird


Daniel Kahneman is a Nobel laureate, eminent Princeton psychologist, and godfather of behavioral economics. So, when he speaks about emergent phenomena, it's probably worth listening.

In this case, he submitted himself to a group Q&A by the readers of Freakonomics, and someone asked him about Apple's Siri and how artificial intelligence more generally might reflect human cognitive biases.

Q. With the launch of Siri and a stated aim to be using the data collected to improve the performance of its AI, should we expect these types of quasi-intelligences to develop the same behavioral foibles that we exhibit, or should we expect something completely different? And if something different, would that something be more likely to reflect the old "rational" assumptions of behavior, or some totally other emergent set of biases and quirks based on its own underlying architecture? My money's on emergent weirdness, but then, I don't have a Nobel Prize. -Peter Bennett

A. Emergent weirdness is a good bet. Only deduction is certain. Whenever an inductive short-cut is applied, you can search for cases in which it will fail. It is always useful to ask "What relevant factors are not considered?" and "What irrelevant factors affect the conclusions?" By their very nature, heuristic shortcuts will produce biases, and that is true for both humans and artificial intelligence, but *the heuristics of AI are not necessarily the human ones*.

The emphasis above is mine. If what he's saying is a little opaque, let me unpack it. Human brains take shortcuts in making decisions. Finding where those shortcuts lead us to dumb places is what his life's work has been all about. Artificial intelligences, say, Google, also have to take shortcuts, but they are *not* the same ones that our brains use. So, when an AI ends up in a weird place by taking a shortcut, its bias strikes us as uncanny.

Get ready, too, because AI bias is going to replace human cognitive bias more and more often.

Via @FelixSalmon

Image: imredesiuk/Shutterstock.


Alexis C. Madrigal

Alexis Madrigal is a senior editor at The Atlantic, where he oversees the Technology Channel. He's the author of Powering the Dream: The History and Promise of Green Technology.

