Deconstructing the Creepiness of the 'Girls Around Me' App—and What Facebook Could Do About It

Social networks and the app ecosystems that surround them may find themselves at odds over user privacy.

Cult of Mac's original screencap of the Girls Around Me application.

Last week, Cult of Mac had a fascinating, stomach-churning story about an application called Girls Around Me that scraped public Foursquare and Facebook check-ins onto a map of the people in your vicinity. Its branding was crass -- "In the mood for love, or just after a one-night stand? Girls Around Me puts you in control!" -- but, as the developers of the app argued, they had technically done nothing wrong aside from being piggish and crude.

Oddly, they were right.

They took publicly available data and put it on a map. The sexysexy frame they put around it made it *seem* creepier, but in terms of the data they accessed and presented, everything was within the rules of the game. They had done nothing that couldn't be done by another app developer.
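To see how little machinery that takes, here's a minimal Python sketch of the same move: sort public check-ins by distance and attach the public profile link. Everything in it is a hypothetical stand-in -- hardcoded sample data instead of the real Foursquare and Facebook APIs, invented names and URLs -- not the app's actual code:

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Checkin:
    name: str         # publicly visible display name
    lat: float        # latitude of the public check-in
    lon: float        # longitude of the public check-in
    profile_url: str  # link to a public profile

def distance_km(lat1, lon1, lat2, lon2):
    """Haversine distance between two lat/lon points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

def nearby(checkins, my_lat, my_lon, radius_km=1.0):
    """Return (distance, check-in) pairs within radius_km, nearest first."""
    hits = [(distance_km(my_lat, my_lon, c.lat, c.lon), c) for c in checkins]
    return sorted([h for h in hits if h[0] <= radius_km], key=lambda h: h[0])

# Hardcoded sample data standing in for a public API response.
feed = [
    Checkin("Alice", 37.7751, -122.4196, "https://example.com/alice"),
    Checkin("Bob", 37.8044, -122.2712, "https://example.com/bob"),
]
for dist, c in nearby(feed, 37.7749, -122.4194):
    print(f"{c.name} checked in {dist:.2f} km away -> {c.profile_url}")
```

There is no hacking step anywhere in there. It's a distance sort and a join over data each user had already marked public, which is exactly why the developers could claim they'd broken no rules.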

This is basically how the app ecosystems around Foursquare, Facebook, and Twitter are supposed to work. Some people out there get an idea for something the main services never thought of, and they build it out of whatever data is available.

Using the traditional privacy idea that once something's public, it is public for any purpose, you're led down a very narrow path of reasoning about this app's appropriateness and about what the services it was built on could do about it. If app developers can use any data that's out there, and some of that data can be assembled into a creepy application, then there will be creepy applications. Foursquare, Apple, and Facebook are absolved of responsibility because it is technically impossible to check each and every app's creepiness factor. As Amit Runchal put it on his blog:

The implicit blame in these conversations is confusing. What, exactly, is Foursquare et al. expected to do? Comb through each and every request to their API? As of last year there were 10,000 developers using the Foursquare API. There are 146,000 publishers in iOS and likely a similar number to be found on Facebook. Collectively there are well over a million apps between all three platforms, with far fewer employees available to review these apps. What are the solutions? Hire more employees? There will never be enough. Do more in-depth testing of apps? That will merely slow down the already frustrating-to-many approval process and likely threaten the ecosystem that we have come to depend upon. And still creepiness will get through. It's inevitable.

This is where we have to break down the concept of creepiness, and it's precisely where NYU philosopher Helen Nissenbaum's concept of privacy in context is so important. First, a quick recap of her idea from last week's story:

The standard explanation for privacy freakouts is that people get upset because they've "lost control" of data about themselves or there is simply too much data available. Nissenbaum argues that the real problem "is the inappropriateness of the flow of information due to the mediation of technology." In her scheme, there are senders and receivers of messages, who communicate different types of information with very specific expectations of how it will be used. Privacy violations occur not when too much data accumulates or people can't direct it, but when one of the receivers or transmission principles changes. The key academic term is "context-relative informational norms." Bust a norm and people get upset.
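One way to make the norm-busting concrete is a toy model in code -- my own illustration of the idea, not Nissenbaum's formalism. Hold the sender and the information type constant, change the receiver or the transmission principle, and the very same data becomes a violation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    sender: str
    receiver: str
    info_type: str
    transmission_principle: str  # e.g. "visible to friends", "broadcast to strangers"

def violates_norm(expected: Flow, actual: Flow) -> bool:
    """A context-relative norm is busted when the receiver or the
    transmission principle changes, even if the data itself doesn't."""
    return (actual.receiver != expected.receiver
            or actual.transmission_principle != expected.transmission_principle)

# The check-in as the user understood it...
expected = Flow("user", "Foursquare friends", "location", "visible to friends in the app")
# ...and the same check-in as Girls Around Me presented it.
actual = Flow("user", "strangers nearby", "location", "plotted on a map for anyone browsing")

print(violates_norm(expected, actual))  # True: same sender, same data, different context
```

Notice that violates_norm never inspects the data itself. The violation lives entirely in the changed context, which is Nissenbaum's point.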

According to the traditional privacy framework, there is no reason for people to get upset about Girls Around Me, or less crass apps that do the same thing. They'd already given up their data to the public, so why was anyone upset? But using Nissenbaum's theory, the bad feelings that people have around the app make sense: People gave data to Foursquare or Facebook in one context and then it showed up in another context that they weren't expecting.

I still take Runchal's point: as long as there are thousands of applications being produced using data from Foursquare, Twitter, and Facebook, it is going to be very, very hard to keep these kinds of contextual privacy violations from occurring. I would go so far as to say that Nissenbaum's ideas challenge the very notion of how application ecosystems are supposed to work. Social app developers, almost by definition, are supposed to come up with new uses -- undreamed-of uses -- for people's personal data.

As more data about people flows into the Internet's core social networks, I think their role as platforms that feed information to ecosystems of apps and their role as services that people use directly may come into conflict. Doing one of those jobs well might preclude doing the other well. After all, retaining the contextual integrity of information is how you get people to share with you in the first place. Think about it this way: Who would you tell your secrets to? The person who keeps his mouth shut or the person who allows your information to be repurposed because other people might find it entertaining?

If violations like this continue, respecting the context in which data is given might not just be good privacy practice; it might become good business practice. And I'm not sure what that would mean for app developers who've become dependent on data from the big services.

Alexis C. Madrigal

Alexis Madrigal is the deputy editor of TheAtlantic.com. He's the author of Powering the Dream: The History and Promise of Green Technology.

The New York Observer has called Madrigal "for all intents and purposes, the perfect modern reporter." He co-founded Longshot magazine, a high-speed media experiment that garnered attention from The New York Times, The Wall Street Journal, and the BBC. While at Wired.com, he built Wired Science into one of the most popular blogs in the world. The site was nominated for best magazine blog by the MPA and best science website in the 2009 Webby Awards. He also co-founded Haiti ReWired, a groundbreaking community dedicated to the discussion of technology, infrastructure, and the future of Haiti.

He's spoken at Stanford, Caltech, Berkeley, SXSW, E3, and the National Renewable Energy Laboratory, and his writing was anthologized in Best Technology Writing 2010 (Yale University Press).

Madrigal is a visiting scholar at the University of California at Berkeley's Office for the History of Science and Technology. Born in Mexico City, he grew up in the exurbs north of Portland, Oregon, and now lives in Oakland.
