Making a Smart(er) Phone—One That's Able to Read Your Emotions

What do neuroscientists and salesmen have in common? Virtually nothing would make them happier than the ability to read your mind: the former in order to understand how the brain works, the latter to leverage that information to sell you things (apologies if you were expecting a punch line to the opening question). People are generally very good at reading emotions in others based upon factors such as tone of voice, facial expression, and body language. But what about machines? Through artificial intelligence and principles of human-computer interaction, can they be taught to read emotions too?

This is precisely what a team of researchers at the Samsung Advanced Institute of Technology in South Korea is working toward by developing smartphones capable of inferring user emotion. As reported in MIT's Technology Review:

Rather than relying on specialized sensors or cameras, the phone infers a user's emotional state based on how he's using the phone.

For example, it monitors certain inputs, such as the speed at which a user types, how often the "backspace" or "special symbol" buttons are pressed, and how much the device shakes. These measures let the phone postulate whether the user is happy, sad, surprised, fearful, angry, or disgusted, says Hosub Lee, a researcher with Samsung Electronics and the Samsung Advanced Institute of Technology's Intelligence Group, in South Korea. Lee led the work on the new system. He says that such inputs may seem to have little to do with emotions, but there are subtle correlations between these behaviors and one's mental state, which the software's machine-learning algorithms can detect with an accuracy of 67.5 percent.
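The article doesn't disclose the details of Lee's model, but the kind of inference it describes, mapping behavioral signals like typing speed, backspace frequency, and device shake to an emotion label, can be sketched with a toy nearest-centroid classifier. Everything below is hypothetical and for illustration only: the feature choices, the invented centroid numbers, and the nearest-centroid approach are assumptions, not the Samsung system's actual design.

```python
import math

# Hypothetical per-emotion feature centroids, imagined as averages learned
# from labeled usage data: (typing speed in chars/sec, backspaces per
# minute, shake magnitude on a 0-1 scale). All values are invented.
CENTROIDS = {
    "happy":   (4.0, 2.0, 0.2),
    "sad":     (1.5, 3.0, 0.1),
    "angry":   (5.0, 9.0, 0.8),
    "fearful": (2.0, 6.0, 0.6),
}

def infer_emotion(typing_speed, backspaces_per_min, shake):
    """Return the emotion whose centroid is nearest to the observed features."""
    sample = (typing_speed, backspaces_per_min, shake)
    return min(CENTROIDS, key=lambda e: math.dist(sample, CENTROIDS[e]))

# Fast typing, frequent corrections, heavy shaking lands nearest "angry".
print(infer_emotion(4.8, 8.5, 0.7))  # → angry
```

A real system would train on far richer data and report probabilistic confidence rather than a single nearest match, which is presumably where the 67.5 percent accuracy figure comes from; this sketch only illustrates the shape of the feature-to-emotion mapping.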

Once a phone infers an emotional state, it can then change how it interacts with the user:

The system could trigger different ringtones on a phone to convey the caller's emotional state or cheer up someone who's feeling low. "The smartphone might show a funny cartoon to make the user feel better," he says.

As another example, one could imagine that our phones and Siri-esque assistants might peek at our calendars and tell when we are busy, and therefore more likely to be stressed. Similarly, contextual cues could be used in conjunction with these behavioral inputs to predict emotional state, e.g. whether the user is experiencing bad weather or a traffic jam based on geolocation. We at medGadget are excited to see how this innovative application of technology develops. Maybe in the not-too-distant future our phones will also double as our psychiatrists.

This post also appears on medGadget, an Atlantic partner site.
