A new patent application from Microsoft points to a future in which your Kinect watches you, and sends ads based on your mood.
You know the routine: Browse a commercial website and you'll start seeing ads for that website (and perhaps its competitors) sprinkled across the web. Email about an upcoming trip and ads for hotels in your destination will begin appearing across the top of your inbox. Even if it provokes discomfort, we have become accustomed to the idea that the trails we leave online will be mined, and that targeted advertisements -- responding to the emails we have typed and the websites we have visited -- will come into our lines of sight.
But what if that data set -- the data that informs the targeting -- expanded beyond the words and clicks we input through our keyboards and our cursors? What if it included your body language as you slumped on the couch after a long day at the office or the hug you shared with your partner upon receiving some good news? What if advertisers could see not the trail you left online, but the life you lead in your living room?
A vision for such a world is set down in the text of a patent application from Microsoft for "Targeting Advertisements Based on Emotion" released last week. There are plenty of other patents for "targeting advertisements based on emotion," many of which assess "emotion" based on the sorts of web trails we know advertisers watch. But Microsoft has something those other patent-holders don't have: the Microsoft Kinect.
The Kinect -- the motion-sensing device that allows people to play Xbox 360 games using only the moves of their bodies and the sounds of their voices -- is one of the most remarkable pieces of consumer technology on the market. It can "see" you kick a ball, steer a car, mimic a dance routine. And, as the new patent makes clear, if it can see you playing games, that same technology can observe you walking around your house, cleaning, preparing dinner, parenting, smiling, crying, kissing, laughing. Microsoft's new patent proposes that all of that activity -- that is to say, life -- is just another data trail, rich with information for advertisers, ready to be mined.
As Microsoft describes it in the ever-elegant language of patent applications:
The voice and gestures from the computing device, e.g., Microsoft Kinect, may be analyzed for speech patterns, body movement, and facial expression to determine whether the user is smiling, frowning, screaming, etc. If the user on the videos or images from the computing device, e.g., Microsoft Kinect, is screaming, the advertisement engine may assign a negative emotional state, such as, upset, to the user. If the user on the videos or images from the computing device, e.g., Microsoft Kinect, is pacing back and forth, the advertisement engine may assign a negative emotional state, such as, worried, to the user.
The Kinect isn't alone in the patent as an "indicator of emotion." Microsoft also proposes building a user's emotional-state profile with data from the old stand-bys ("browser behavior, webpage content, search queries, email, instant messages") and a few more innovative sources, such as web cameras, a player's performance in online games, and, of course, the Kinect. Once users' "emotional states" are determined, "the computer system delivers the selected advertisements with the highest monetization values to the users that are emotionally compatible."
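The patent's logic, as described, reduces to a simple pipeline: map an observed behavior to an emotional state, then serve the compatible ad with the highest monetization value. A minimal sketch of that pipeline might look like the following -- all of the behavior mappings, ad names, and dollar values here are invented for illustration, not taken from the patent:

```python
# Hypothetical sketch of the patent's described ad-selection logic:
# observed behavior -> assigned emotional state -> highest-value compatible ad.
# Every name and number below is illustrative, not from Microsoft's filing.

BEHAVIOR_TO_EMOTION = {
    "smiling": "happy",
    "screaming": "upset",    # the patent's own example
    "pacing": "worried",     # likewise
}

# Each ad lists the emotional states it targets and a monetization value.
ADS = [
    {"name": "comfort-food delivery", "targets": {"upset", "worried"}, "value": 0.8},
    {"name": "vacation package",      "targets": {"happy"},            "value": 1.2},
    {"name": "insurance plan",        "targets": {"worried"},          "value": 1.5},
]

def select_ad(observed_behavior):
    """Return the emotionally compatible ad with the highest monetization value."""
    emotion = BEHAVIOR_TO_EMOTION.get(observed_behavior)
    candidates = [ad for ad in ADS if emotion in ad["targets"]]
    if not candidates:
        return None
    return max(candidates, key=lambda ad: ad["value"])

print(select_ad("pacing")["name"])     # "insurance plan" beats "comfort-food delivery" on value
print(select_ad("screaming")["name"])  # only "comfort-food delivery" targets "upset"
```

The unsettling part is not the trivial code but the input: the "observed behavior" would come from a camera and microphone watching a living room.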
There are two distinct but related aspects of this vision for advertising's future that are troubling, each in its own way. (Also, it must be noted that just because Microsoft has applied to patent this system doesn't mean the company will implement it. It's just a sign of what it believes is possible with the technology it has.)
The first is simply that this technology could lead to specific combinations of advertisements and micro-audiences that are ethically suspect. For example, as a society we decided at some point that cigarette advertising targeted at kids was off limits. What about alcohol ads targeted at depressives? What about junk food targeted at people who are obese? It's possible to imagine a regulatory regime that could mediate the pipeline from corporations to an audience -- as with cigarettes -- but the details, not to mention the political support, would be a nightmare to develop.
The second goes to questions of privacy that would be difficult, if not outright insurmountable, to resolve if such a technology were to go on the market. As NYU philosopher Helen Nissenbaum has argued, the particular context in which information is transmitted -- and the expectations bound up in that context -- is central to determining the proper handling of that information. Part of the problem with online privacy is that we may have an expectation of privacy (resulting in part from a lack of understanding of how ad tracking works) that is not being met. While an expectation of privacy online may be something of a cultural gray area at this point in time, the same is simply not true of what people expect about what passes inside their own homes. There is probably no place regarded as more private -- culturally, legally -- than the places where we live.
A hoax earlier this year envisioned a TV called Hearscreen that would target ads at you based on the conversations you had with your friends. The "joke" of Hearscreen was on you: "Find this troubling?" it asked. "This is exactly what happens every day on Facebook, as advertisers listen in on your social life." Microsoft's patent brings that joke home, quite literally, to where we have the strongest expectation of privacy. If we aren't troubled by crossing that line, perhaps there are no lines at all.