Obscurity is a protective state that can further a number of goals, such as autonomy, self-fulfillment, socialization, and relative freedom from the abuse of power.
Facebook's announcement of its new Graph search tool on Tuesday set off yet another round of rapid-fire analysis about whether Facebook is properly handling its users' privacy. Unfortunately, most of the analysts haven't framed the story properly. Yes, Zuckerberg appears to be respecting our current privacy settings. And, yes, there just might be more stalking ahead. Neither framing device, however, is adequate. If we rely too much on them, we'll miss the core problem: the more accessible our Facebook information becomes, the less obscurity protects our interests.
While many debates over technology and privacy concern obscurity, the term rarely gets used. This is unfortunate, as "privacy" is an over-extended concept. It grabs our attention easily, but is hard to pin down. Sometimes, people talk about privacy when they are worried about confidentiality. Other times they invoke privacy to discuss issues associated with corporate access to personal information. Fortunately, obscurity has a narrower purview.
Obscurity is the idea that when information is hard to obtain or understand, it is, to some degree, safe. Safety, here, doesn't mean inaccessible. Competent and determined data hunters armed with the right tools can always find a way to get it. Less committed folks, however, experience great effort as a deterrent.
Online, obscurity is created through a combination of factors. Being invisible to search engines increases obscurity. So does using privacy settings and pseudonyms. Disclosing information in coded ways that only a limited audience will grasp enhances obscurity, too. Since few online disclosures are truly confidential or highly publicized, the lion's share of communication on the social web falls along the expansive continuum of obscurity: a range that runs from completely hidden to totally obvious.
Legal debates surrounding obscurity can be traced back at least to U.S. Department of Justice v. Reporters Committee for Freedom of the Press (1989). In this decision, the United States Supreme Court recognized a privacy interest in the "practical obscurity" of information that was technically available to the public, but could only be found by spending a burdensome and unrealistic amount of time and effort in obtaining it. Since this decision, discussion of obscurity in the case law remains sparse. Consequently, the concept remains under-theorized as courts continue their seemingly Sisyphean struggle with finding meaning in the concept of privacy.
Many contemporary privacy disputes are probably better classified as concern over losing obscurity. Consider the recent debate over whether a newspaper violated the privacy rights of gun owners by publishing a map compiled from information gleaned from public records. The situation left many scratching their heads. After all, how can public records be considered private? What obscurity draws our attention to is that while the records were accessible to any member of the public prior to the rise of big data, more effort was required to obtain, aggregate, and publish them. In that prior context, technological constraints implicitly protected privacy interests. Now, in an attempt to keep pace with diminishing structural barriers, New York is considering exempting gun owners from "public records laws that normally allow newspapers or private citizens access to certain information the government collects."
The obscurity of public records and other legally available information is at issue in recent disputes over publishing mug shots and homeowner defaults. Likewise, claims for "privacy in public," as arise in discussions over license-plate readers, GPS trackers, and facial recognition technologies, are often pleas for obscurity that get either miscommunicated or misinterpreted as insistence that one's public interactions should remain secret.
Obscurity received some attention when Facebook previously rolled out Timeline. The Electronic Privacy Information Center, for example, was dismayed by how easy the design made it to retrieve past posts -- including ones that previously required extensive manual searching to locate.
Alas, the two dominant ways of discussing Graph have not had that same focus on obscurity. One narrative suggests that since Graph will only reveal information to users that was previously visible to them or publicly shared, it presents no new privacy issues. As Facebook hammered home, a user's original privacy settings are neither altered nor violated. According to Kashmir Hill, "Zuckerberg and crew emphasized the 'privacy awareness' of the new search engine."
"You want a search tool that gives you access to just things that people have shared with you," said Zuckerberg.
"I can only search for what I can already see on Facebook," added Tom Stocky, director of product management.
"You can only search for the content people have shared with you," re-emphasized software developer Lars Rasmussen.
Respecting Facebook users' privacy settings is no small feature, given the harm that can result when privacy settings are given too little weight in socio-technical design. Thanks to the soothing message and intuitive appeal of the "self-selected insiders" narrative, many reporters are spreading its gospel. Wired and CNN, among others, note that Graph doesn't expose any information that wasn't already available on Facebook.