Making a Smart(er) Phone—One That's Able to Read Your Emotions


What do neuroscientists and salesmen have in common? Virtually nothing would make them happier than the ability to read your mind: the former in order to understand how the brain works, the latter to leverage that information to sell you things (apologies if you were expecting a punch line to the opening question). People are generally very good at reading emotions in others based on factors such as tone of voice, facial expression, and body language. But what about machines? Through artificial intelligence and principles of human-computer interaction, can they be taught to read emotions, too?

This is precisely what a team of researchers at the Samsung Advanced Institute of Technology in South Korea is working toward: smartphones capable of inferring a user's emotions. As reported in MIT's Technology Review:

Rather than relying on specialized sensors or cameras, the phone infers a user's emotional state based on how he's using the phone.

For example, it monitors certain inputs, such as the speed at which a user types, how often the "backspace" or "special symbol" buttons are pressed, and how much the device shakes. These measures let the phone postulate whether the user is happy, sad, surprised, fearful, angry, or disgusted, says Hosub Lee, a researcher with Samsung Electronics and the Samsung Advanced Institute of Technology's Intelligence Group, in South Korea. Lee led the work on the new system. He says that such inputs may seem to have little to do with emotions, but there are subtle correlations between these behaviors and one's mental state, which the software's machine-learning algorithms can detect with an accuracy of 67.5 percent.
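The behavior-based inference the researchers describe could, in spirit, be sketched as a classifier over a few behavioral features. The snippet below is purely illustrative: the feature set, the centroid values, and the nearest-centroid approach are all assumptions for the sketch, since the article does not publish Samsung's actual model, features, or training data.

```python
from math import dist

# Hypothetical feature vectors: (typing speed in chars/sec,
# backspace presses per minute, device shake magnitude 0-1).
# These centroids are invented for illustration only.
EMOTION_CENTROIDS = {
    "happy":     (4.0, 2.0, 0.3),
    "sad":       (1.5, 1.0, 0.1),
    "angry":     (5.0, 8.0, 0.9),
    "surprised": (3.0, 5.0, 0.6),
}

def infer_emotion(typing_speed, backspace_rate, shake):
    """Return the emotion whose centroid lies nearest to the observed
    behavior -- a toy stand-in for a trained machine-learning model."""
    sample = (typing_speed, backspace_rate, shake)
    return min(EMOTION_CENTROIDS,
               key=lambda e: dist(sample, EMOTION_CENTROIDS[e]))

# Fast, error-prone typing on a shaking phone reads as "angry" here.
print(infer_emotion(4.8, 7.5, 0.8))
```

A real system would learn such decision boundaries from labeled usage data rather than hand-picked centroids, which is presumably how the reported 67.5 percent accuracy was obtained.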

Once a phone infers an emotional state, it can then change how it interacts with the user:

The system could trigger different ringtones on a phone to convey the caller's emotional state or cheer up someone who's feeling low. "The smartphone might show a funny cartoon to make the user feel better," he says.

As another example, one could imagine our phones and Siri-esque assistants peeking at our calendars to tell when we are busy, and therefore more likely to be stressed. Similarly, contextual cues could be used to predict emotional state, e.g., whether the user is experiencing bad weather or a traffic jam, based on geolocation. We at Medgadget are excited to see how this innovative application of technology develops. Maybe in the not-too-distant future our phones will also double as our psychiatrists.

This post also appears on medGadget, an Atlantic partner site.

medGadget is written by a group of MDs and biomedical engineers.
