Data Doppelgängers and the Uncanny Valley of Personalization

Why customized ads are so creepy, even when they miss their target

"What is it about my data that suggests I might be a good fit for an anorexia study?" That's the question my friend Jean asked me after she saw this targeted advertisement on her Facebook profile:

[Screenshot of the targeted ad recruiting for an anorexia study. Credit: Facebook/Massachusetts General Hospital]

She came up with a pretty good hypothesis. Jean is an MIT computer scientist who works on privacy programming languages. Because of her advocacy work on graduate student mental health, her browsing history and status updates are full of links to resources that might suggest she's looking for help. Maybe Facebook inferred what Jean cares about, but not why.

Days later, I saw a similar ad. Unlike Jean, I didn't have a good explanation for why I might have been targeted for the ad, which led me to believe that it could be broadly aimed at all women between the ages of 18 and 45 in the greater Boston area. (When I clicked to learn more about the study, this was listed as the target demographic.)

Still, it left us both with the unsettling feeling that something in our data suggests anorexia. Ads seem trivial. But when they start to question whether I'm eating enough, a line has been crossed. I see similar ads plastered across the Boston T recruiting participants for medical studies on diabetes, bipolar disorder, and anxiety, but their effect is materially different. The only reason I see those ads is that I ride the T; they invite any rider to self-select for eligibility. It's different online, where I am supposed to see ads because something about my data suggests they are relevant to me.

Google thinks I’m interested in parenting, superhero movies, and shooter games. The data broker Acxiom thinks I like driving trucks. My data doppelgänger is made up of my browsing history, my status updates, my GPS locations, my responses to marketing mail, my credit card transactions, and my public records. Yet it constantly gets me wrong, often to hilarious effect. I take some comfort in knowing the system doesn’t know me too well, yet it is unnerving when something is misdirected at me. Why do I take it so personally when personalization gets it wrong?

Right now we don’t have many tools for understanding the causal relationship between our data and how third parties use it. When we try to figure out why creepy ads follow us around the Internet, or why certain friends show up in our newsfeeds more than others, it’s difficult to discern coarse algorithms from hyper-targeted machine learning that may be generating the information we see. We don’t often get to ask our machines, "What makes you think that about me?"

Personalization appeals to a Western, egocentric belief in individualism. Yet it is built on generalizing methods, the statistical distributions and normalized curves used to classify and categorize large populations. Personalization purports to be uniquely meaningful, yet it alienates us in its mass application. Data tracking and personalized advertising are often described as “creepy.” Personalized ads and experiences are supposed to reflect individuals, so when these systems miss their mark, they can interfere with a person’s sense of self. It’s hard to tell whether the algorithm doesn’t know us at all, or whether it actually knows us better than we know ourselves. And it's disconcerting to think that there might be a glimmer of truth in what otherwise seems unfamiliar. This goes beyond creepy, and even beyond the sense of being watched.

We’ve wandered into the uncanny valley.

* * * 

Since the 1970s, theorists have used the term "uncanny valley" to describe the unsettling feeling some technology gives us. The Japanese roboticist Masahiro Mori first suggested that we are willing to tolerate robots mimicking human behaviors and physical characteristics only up to a point: the moment a robot looks almost human but still clearly isn’t.

The threshold is where we shift from judging a robot as a robot and instead hold it to human standards. Researchers at the University of Bolton in the UK have described a similar shift in digital animation as the "Uncanny Wall": increasing realism and technological advancement keep altering our expectations of how lifelike technologies should be. I would argue that we hit that wall when we can't distinguish whether something is broadly or very personally targeted at us. The promise of Big Data has built up our expectations for precise messaging, yet much of advertising is nowhere near that refined. So we don't know how to judge what we are seeing, because we don't know what standard to hold it against.

[Graph of Mori's uncanny valley: affinity for a robot rises with human likeness, then plunges just before full human resemblance. Credit: Wikipedia]

The uncanny valley of robotics is grounded in the social cues of the visual. We are repulsed by the plastic skin, by the stilted movements, by the soulless eyes of our robotic counterparts. In contrast, personally targeted digital experiences present a likeness of our needs and wants, but the contours of our data are obscured by a black box of algorithms. Based on an unknown set of prior behaviors, these systems anticipate intentions we might not even know we have. Our data may not be animate or embodied like a robot, but it does act with agency. Data likeness can’t be seen or touched, but neither can our sense of ourselves. This makes the uncanny even more unnerving.

Uncanny personalization occurs when the data is both too close and not quite close enough to what we know about ourselves. This is rooted in Sigmund Freud’s famous treatment of the uncanny, which he traced to the feelings associated with encountering something strangely familiar. In Freud’s original writing, the uncanny is the unheimlich—literally translated as "unhomely," and the opposite of heimlich, which is the familiar, comfortable feeling of being at home. 

Sara M. Watson is a technology critic and a Fellow at the Berkman Center for Internet and Society at Harvard University.
