A veteran is having a virtual therapy session. His counselor is named Ellie, and she is, among other things, a very good listener. She's responsive to the soldier's comments. She reads the subtleties of his facial expressions. She nods appreciatively at his insights. She grimaces, slightly, when he tells her about a trauma he experienced.
Ellie is an avatar, a virtual therapist developed at USC with funding from DARPA, the Defense Department's advanced research agency. And "people love interacting with her," says Louis-Philippe Morency, a research assistant professor at USC's Institute for Creative Technologies. Morency has been working with Ellie—part of the university's SimSensei project—for several years now. In that time, he has helped build a program capable of reading and responding to human emotion in real time. And capable, more to the point, of offering those responses via a human-like animation.
To build that system, the SimSensei team took a three-step approach: first, they analyzed actual humans interacting with each other, observing the linguistic and behavioral nuances of those conversations. From there, they created a kind of intermediate step they nicknamed the "Wizard of Oz." This was "a virtual human," as Morency describes it, "with a human behind, pressing the buttons." Once they had a framework for the rhythms of a face-to-face therapy session, they added facial-movement sensors and dialogue managers—creating a system, all in all, that can read and react to human emotion.
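To get a feel for how that last step fits together, here is a minimal, purely illustrative sketch of a sensor-to-dialogue-manager loop in Python. None of the names below come from the actual SimSensei software; they are assumptions standing in for the real perception layer (which tracks signals like smiles and gaze) and the real dialogue manager (which decides how the avatar should respond).

```python
# Hypothetical illustration of a perception-to-dialogue loop.
# The class names, signals, and rules here are invented for clarity;
# they are not the actual SimSensei/USC ICT code.

from dataclasses import dataclass
import random


@dataclass
class PerceptionFrame:
    """One time-slice of nonverbal signals a sensing layer might emit."""
    smile_intensity: float   # 0.0 (neutral) to 1.0 (broad smile)
    gaze_down: bool          # averted, downward gaze
    speaking: bool           # is the participant currently talking?


class DialogueManager:
    """Toy rule-based manager: maps perceived affect to a listener behavior."""

    def react(self, frame: PerceptionFrame) -> str:
        if frame.speaking:
            # While the participant talks, the avatar only backchannels.
            if frame.gaze_down and frame.smile_intensity < 0.2:
                return "lean in, soften expression"   # possible distress
            return "nod appreciatively"
        # The participant has paused: choose a follow-up prompt.
        if frame.gaze_down:
            return "say: 'Take your time. Can you tell me more about that?'"
        return random.choice([
            "say: 'How did that make you feel?'",
            "say: 'What happened next?'",
        ])


if __name__ == "__main__":
    manager = DialogueManager()
    stream = [
        PerceptionFrame(smile_intensity=0.6, gaze_down=False, speaking=True),
        PerceptionFrame(smile_intensity=0.1, gaze_down=True, speaking=True),
        PerceptionFrame(smile_intensity=0.1, gaze_down=True, speaking=False),
    ]
    for frame in stream:
        print(manager.react(frame))
```

The real system, of course, replaces the hand-written rules with models trained on those observed human-to-human sessions; the sketch only shows the shape of the loop, with perception feeding a dialogue manager that picks a verbal or nonverbal response on each pass.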