Making a Smart(er) Phone—One That's Able to Read Your Emotions


What do neuroscientists and salesmen have in common? Virtually nothing would make them happier than the ability to read your mind: the former in order to understand how the brain works, the latter to leverage that information to sell you things (apologies if you were expecting a punch line to the opening question). People are generally very good at reading emotions in others based upon factors such as tone of voice, facial expression, and body language. But what about machines? Through artificial intelligence and principles of human-computer interaction, can they be taught to read emotions too?

This is precisely what a team of researchers at the Samsung Advanced Institute of Technology in South Korea is working toward by developing smartphones capable of inferring user emotion. As reported in MIT's Technology Review:

Rather than relying on specialized sensors or cameras, the phone infers a user's emotional state based on how he's using the phone.

For example, it monitors certain inputs, such as the speed at which a user types, how often the "backspace" or "special symbol" buttons are pressed, and how much the device shakes. These measures let the phone postulate whether the user is happy, sad, surprised, fearful, angry, or disgusted, says Hosub Lee, a researcher with Samsung Electronics and the Samsung Advanced Institute of Technology's Intelligence Group, in South Korea. Lee led the work on the new system. He says that such inputs may seem to have little to do with emotions, but there are subtle correlations between these behaviors and one's mental state, which the software's machine-learning algorithms can detect with an accuracy of 67.5 percent.
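To make the idea concrete, here is a minimal sketch of how behavioral signals could feed an emotion classifier. Samsung's actual model, features, and training data have not been published, so everything below — the feature set, the per-emotion "typical" values, and the use of a nearest-centroid rule in place of their machine-learning algorithms — is an illustrative assumption, not the real system.

```python
# Illustrative sketch only: infer an emotion label from phone-usage signals
# like those described above (typing speed, backspace frequency, device shake).
# The centroids are invented example values; a real system would learn them
# from labeled usage data.

# Hypothetical per-emotion feature vectors:
# (keys per second, backspaces per 100 keystrokes, shake magnitude 0-1)
CENTROIDS = {
    "happy": (4.0, 3.0, 0.2),
    "sad":   (1.5, 6.0, 0.1),
    "angry": (5.0, 12.0, 0.8),
}

def infer_emotion(typing_speed, backspace_rate, shake):
    """Return the emotion whose centroid is closest (squared distance)
    to the observed feature vector."""
    sample = (typing_speed, backspace_rate, shake)

    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(sample, CENTROIDS[label]))

    return min(CENTROIDS, key=dist)

if __name__ == "__main__":
    # Fast, error-prone, shaky typing versus slow, careful, still typing
    print(infer_emotion(5.2, 11.0, 0.7))
    print(infer_emotion(1.4, 6.5, 0.05))
```

The real system reports only 67.5 percent accuracy over six emotions, which underlines how noisy these behavioral correlations are — a toy classifier like this one mainly shows the shape of the pipeline, not its difficulty.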

Once a phone infers an emotional state, it can then change how it interacts with the user:

The system could trigger different ringtones on a phone to convey the caller's emotional state or cheer up someone who's feeling low. "The smartphone might show a funny cartoon to make the user feel better," he says.

As another example, one could imagine that our phones and Siri-esque assistants may peek at our calendars and be able to tell when we are busy, and therefore more likely to be stressed. Similarly, contextual cues may be used in conjunction with these behavioral signals to predict emotional state, e.g. whether the user is experiencing bad weather or a traffic jam based on geolocation. We at Medgadget are excited to see how this innovative application of technology develops. Maybe in the not-too-distant future our phones will also double as our psychiatrists.

This post also appears on medGadget, an Atlantic partner site.

medGadget is written by a group of MDs and biomedical engineers.