Tech writers have become so focused on privacy that they no longer see the cultural context privacy exists within.
We all agree that the Girls Around Me app is "creepy," but let's step back and look at the conversation around the creepy app and see what that reveals about how we think about, write about and understand technology, privacy and issues of sex and gender. Tech writers, programmers, Silicon Valley entrepreneurs and the like sometimes get so focused on new technologies that the social world around them, full of people and politics and power, fades from view, and the world of the database, of circuits, numbers and information, comes to be all that matters.
This was my reaction when reading the many articles each tech outlet produced in response to the Girls Around Me smartphone application. You probably know of the app by now, but, simply, it combined Facebook and Foursquare information to display women nearby. It used public profile and check-in information to provide the women's names, photos, physical location and much else. The "girls around you" did not opt in to the app explicitly; rather, this information was scraped from those other sites. The app has since been voluntarily pulled, but may be re-introduced in the future.
Most articles about Girls Around Me rightly call it "creepy" and engage in an important conversation about what data one makes public and how that information can be misused. The tenor of the reaction is that this kind of app is inevitable: a public Facebook profile + a public Foursquare check-in = Girls Around Me (as one reader tweeted to me).
But this equation misses a massively important variable: sexism. It is no coincidence the app is called Girls Around Me. The outcry has as much to do with sexual politics as it does with data and privacy. Yet article after article has ignored the sexism inherent in the app and instead talked only about data and privacy.
Alexis Madrigal wrote a post here in The Atlantic claiming to deconstruct the "creepiness" of the Girls Around Me app. But deconstructing why this app is creepy requires thinking about female visibility in our culture. Madrigal does call the app "piggish," "creepy," and "crude," but he does not go so far as to call it "sexist" or explain just what makes the app piggish, creepy, and crude. Doing so would have involved, at a minimum, discussing the objectification of women inherent in the straight-male gaze, or perhaps the tendency to blame women for not covering up in the first place. More angles could be covered, and these are all conversations routinely held outside of academic circles.
Another article states that "[the] outcry around Girls Around Me once again raised the issue of online privacy, highlighting the question of how much data one should share publicly to stay safe." This is an important question, but what about questioning the culture that has made this data a safety issue in the first place?
Article after article views Girls Around Me as only a data issue and not also a sexism issue. These posts begin and end with the conclusion that "Girls Around Me and its ilk will convince these people to take their own privacy seriously," rather than concluding that maybe there is sexism going on as well.
Simply, "data" is part of the story, but so is the sexist culture that data exists within. The gender politics of visibility are too complicated and important to dismiss as "obvious."
The data angle explains only why an app locating people might exist; without addressing the politics of all this, we cannot account for why the app is gendered the way it is, for the coding and design choices that were made, or for the real consequences this kind of app has. Tech writers, unlike the Girls Around Me developers, have their hearts in the right place, but, like the developers, they have focused too much on the data and forgotten the social world in which that data is situated.
Another consequence of the tendency to talk only about data, and not about society, norms, politics, values and everything else confusing about the analog world, is the victim-blaming implicit in most of these articles. The cause of the problem? Women sharing data. The solution? Women need to better control their data. (In fairness, Madrigal also asks what companies can do about this, but he does not come to any answers; the burden is left on women.) This data-centric view, that data-sharing is bad and you should control your data, borders on blaming the victim instead of criticizing the sexist culture that makes this data dangerous in the first place.
Kashmir Hill makes this point when she writes, " 'You're too public with your digital data, ladies,' may be the new 'your skirt was too short and you had it coming.' " By looking at information flows as well as sexual politics, Hill paints a better picture of why the Girls Around Me app is "creepy," and it is all a lot more complicated than "obvious."
Given this, why have we collectively heralded Girls Around Me as a moment to discuss data and privacy and not also a moment to discuss sexism?