A veteran is having a virtual therapy session. His counselor is named Ellie, and she is, among other things, a very good listener. She's responsive to the soldier's comments. She reads the subtleties of his facial expressions. She nods appreciatively at his insights. She grimaces, slightly, when he tells her about a trauma he experienced.
Ellie is an avatar, a virtual therapist developed at USC with funding from DARPA, the Defense Department's advanced research agency. And "people love interacting with her," says Louis-Philippe Morency, a research assistant professor at USC's Institute for Creative Technologies. Morency has been working on Ellie—part of the university's SimSensei project—for several years now. In that time, he has helped build a program capable of reading and responding to human emotion in real time—and capable, more to the point, of delivering those responses through a human-like animation.
To build that system, the SimSensei team took a three-step approach: first, they analyzed actual humans interacting, to observe the linguistic and behavioral nuances of those conversations. From there, they created a kind of intermediate step they nicknamed the "Wizard of Oz." This was "a virtual human," as Morency describes it, "with a human behind, pressing the buttons." Once they had a framework for the rhythms of a face-to-face therapy session, they added facial-movement sensors and dialogue managers—creating a system, all in all, that can read and react to human emotion.
And to all that they added animation modules, giving a body—well, a "body"—to their program. Ellie was born.
If you have a conversation with Ellie, her creators say, she will be able to suss out symptoms of anxiety, depression, and—of particular interest to DARPA—PTSD. The avatar can also, they say, help prepare soldiers before they go to the battlefield. "You want to train people on non-verbal behaviors," as Morency puts it; so, for example, soldiers can learn to pick up on subtle facial cues from people they might encounter in a theater of war.
Morency and his team have been demonstrating Ellie and her fellow virtual psychologists in Los Angeles, to people curious about what it's like to be analyzed by an avatar. So far, more than 500 people have talked to her. And—here's the surprising thing—they seem to enjoy the experience. Each demo was initially set at 15 minutes; Morency says, however, that people kept extending their time with Ellie—up to 30 minutes. That's because, Morency figures, "they don't feel judged" by her.
And that's in turn because, as he puts it,
Ellie is an interviewer, but she is there as a computer. She doesn't have judgment directly. So people love talking to her.... they're more themselves. They're really expressing and showing something that usually if you know that people are around you—or as an interviewer—they think, 'Oh, I'm going to be careful.' But with Ellie, they're more themselves.
Morency compares the appeal, actually, to that of pets. "People, after talking to Ellie, they feel better," he points out. "Some people talk to their dogs; even though the dogs don't understand it... I think there's a little bit of that effect—just talking with someone makes you feel better."