Obscurity: A Better Way to Think About Your Data Than 'Privacy'




Facebook's announcement of its new Graph search tool on Tuesday set off yet another round of rapid-fire analysis about whether Facebook is properly handling its users' privacy. Unfortunately, most of the rapid-fire analysts haven't framed the story properly. Yes, Zuckerberg appears to be respecting our current privacy settings. And, yes, there just might be more stalking ahead. Neither framing device, however, is adequate. If we rely too much on them, we'll miss the core problem: the more accessible our Facebook information becomes, the less obscurity protects our interests.

While many debates over technology and privacy concern obscurity, the term rarely gets used. This is unfortunate, as "privacy" is an over-extended concept. It grabs our attention easily, but is hard to pin down. Sometimes, people talk about privacy when they are worried about confidentiality. Other times they invoke privacy to discuss issues associated with corporate access to personal information. Fortunately, obscurity has a narrower purview.

Obscurity is the idea that when information is hard to obtain or understand, it is, to some degree, safe. Safety, here, doesn't mean inaccessible. Competent and determined data hunters armed with the right tools can always find a way to get it. Less committed folks, however, experience great effort as a deterrent.

Online, obscurity is created through a combination of factors. Being invisible to search engines increases obscurity. So does using privacy settings and pseudonyms. Disclosing information in coded ways that only a limited audience will grasp enhances obscurity, too. Since few online disclosures are truly confidential or highly publicized, the lion's share of communication on the social web falls along the expansive continuum of obscurity: a range that runs from completely hidden to totally obvious.


Legal debates surrounding obscurity can be traced back at least to U.S. Department of Justice v. Reporters Committee for Freedom of the Press (1989). In this decision, the United States Supreme Court recognized a privacy interest in the "practical obscurity" of information that was technically available to the public, but could only be found by spending a burdensome and unrealistic amount of time and effort in obtaining it. Since this decision, discussion of obscurity in the case law remains sparse. Consequently, the concept remains under-theorized as courts continue their seemingly Sisyphean struggle with finding meaning in the concept of privacy.

Many contemporary privacy disputes are probably better classified as concern over losing obscurity. Consider the recent debate over whether a newspaper violated the privacy rights of gun owners by publishing a map comprised of information gleaned from public records. The situation left many scratching their heads. After all, how can public records be considered private? What obscurity draws our attention to is that while the records were accessible to any member of the public prior to the rise of big data, more effort was required to obtain, aggregate, and publish them. In that prior context, technological constraints implicitly protected privacy interests. Now, in an attempt to keep pace with diminishing structural barriers, New York is considering exempting gun owners from "public records laws that normally allow newspapers or private citizens access to certain information the government collects."

The obscurity of public records and other legally available information is at issue in recent disputes over publishing mug shots and homeowner defaults. Likewise, claims for "privacy in public," as occur in discussion over license-plate readers, GPS trackers, and facial recognition technologies, are often pleas for obscurity that get either miscommunicated or misinterpreted as insistence that one's public interactions should remain secret.

Obscurity received some attention when Facebook previously rolled out Timeline. The Electronic Privacy Information Center, for example, was dismayed by how easy the design made it to retrieve past posts -- including ones that previously required extensive manual searching to locate. 

Alas, the two dominant ways of discussing Graph have not had that same focus on obscurity. One narrative suggests that since Graph will only reveal information to users that was previously visible to them or publicly shared, it presents no new privacy issues. As Facebook hammered home, a user's original privacy settings are neither altered nor violated. According to Kashmir Hill, "Zuckerberg and crew emphasized the 'privacy awareness' of the new search engine."

"You want a search tool that gives you access to just things that people have shared with you," said Zuckerberg.

"I can only search for what I can already see on Facebook," added Tom Stocky, director of product management.

"You can only search for the content people have shared with you," re-emphasized software developer Lars Rasmussen.

Respecting Facebook users' privacy settings is no small feature, due to the harm that can result when privacy settings are given too little weight in socio-technical design. Thanks to the soothing message and intuitive appeal of the "self-selected insiders" narrative, many reporters are spreading its gospel. Wired and CNN, among others, note Graph doesn't expose any information that wasn't already available on Facebook.

Ultimately, the "you choose who to let in" narrative is powerful because it trades on traditional notions of space and boundary regulation, and further appeals to our heightened sense of individual responsibility, and possibly even vanity. The basic message is that so long as we exercise good judgment when selecting our friends, no privacy problems will arise. What this appeal to status quo relations and existing privacy settings conceals is the transformative potential of Graph: new types of searching can emerge that, due to enhanced frequency and newly created associations between data points, weaken, and possibly obliterate, obscurity. Of course, that result won't bother everyone. Some users won't miss their obscurity havens, while others will find the change dismaying. As we'll clarify shortly, those who become dismayed will have good reason for being upset.

The other dominant narrative emerging is that the Graph will simplify "stalking." Kashmir Hill states, "Good news for snoops: the new tool will make Facebook stalking much easier." Megan Rose Dickey wrote an article titled "Facebook's Graph Search Is Awesome For Stalkers And Hookups." While the "stalker" frame brings us a little closer to articulating what the harm from the Graph might be, it, too, is inadequate.

First, the stalking frame risks creating undue psychological associations with the more severe harms of stalking, as legally defined and prohibited. Yes, we recognize these accounts use "stalking" colloquially. But words have power, and such deliberately evocative rhetoric unduly muddies the already murky conceptual waters.

Second, the stalker frame implies that the problem is people with bad intentions getting our information. Determined stalkers certainly pose a threat to the obscurity of information because they represent an increased likelihood that obscure information will be found and understood. Stalkers seek and collect information with varying degrees of rigor. But as social search moves from an atomistic to a composite form, many harms resulting from loss of obscurity will likely be accidental. Well-intentioned searches can be problematic, too.


Consider the following hypothetical to demonstrate this point. Mark Zuckerberg mentioned that Graph is still in beta and many new features could be added down the road. It is not a stretch to assume Graph could enable searching through the content of posts a user has liked or commented on and generating categories of interests from it. For example, users could search which of their friends are interested in politics, or, perhaps, specifically, in left-wing politics. While many Facebook users are outspoken on politics, others hold these beliefs close. For various reasons, these less-outspoken users might still support the political causes of their friends through likes and comments, but refrain from posting political material themselves. In this scenario, a user who wasn't a fan of political groups or causes, didn't list political groups or causes as interests, and didn't post political stories, could still be identified as political. The Graph would wrench these scattered showings of support from the various corners of Facebook into a composite profile that presents both obscurity and accuracy concerns.
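To make the hypothetical concrete, here is a minimal sketch of how such a composite profile could be assembled from scattered signals. Everything here is invented for illustration: the activity log, the users, and the keyword-to-topic map are assumptions, and this is in no way Facebook's actual API or algorithm.

```python
# Hypothetical sketch: inferring a composite "interest profile" from
# scattered likes and comments. All data and keywords are invented.
from collections import Counter

# Hypothetical activity log: (user, action, text of the post engaged with)
activity = [
    ("alice", "like",    "Great rally for the left-wing candidate today"),
    ("alice", "comment", "Agreed, this policy debate matters"),
    ("alice", "like",    "New hiking trail photos"),
    ("bob",   "like",    "Cute cat video"),
]

# Crude keyword-to-topic map (an assumption for the sketch)
TOPICS = {"rally": "politics", "candidate": "politics",
          "policy": "politics", "debate": "politics",
          "hiking": "outdoors", "cat": "pets"}

def composite_profile(user):
    """Aggregate a user's likes/comments into inferred interest counts."""
    counts = Counter()
    for who, _action, text in activity:
        if who != user:
            continue
        for word in text.lower().split():
            if word in TOPICS:
                counts[TOPICS[word]] += 1
    return counts

# Alice never posted anything political herself, yet the composite view
# surfaces "politics" as her dominant inferred interest.
print(composite_profile("alice"))  # Counter({'politics': 4, 'outdoors': 1})
```

The point of the sketch is that no single like or comment reveals much; the obscurity loss comes entirely from aggregation across them.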

The final reason the stalker frame is not a good fit for Graph is that it implies the harm at stake is the experience of feeling "creeped out." While the term 'creepy' isn't appearing as much as with other Facebook-related stories, it is still a non-trivial aspect of the Graph narrative. As one of us has previously posited, due to its vagueness and heightened emotional resonance, 'creepy' is not a helpful term to use when identifying the harm that might result from new technologies.

Some of the chatter about Graph and privacy betrays the optimistic belief that Facebook will not diminish too much obscurity, in order to keep consumers happy and willing to post their lives away. Facebook regularly emphasizes the importance of users believing that posting on Facebook is safe. But is it really wise to presume Facebook's financial interests align with the user interest in maintaining obscurity? In a system that purportedly relies upon user control, it is still unclear how, and whether, users will be able to detect when their personal information is no longer obscure. How will they be able to anticipate the numerous different queries that might expose previously obscure information? Will users even be aware of all the composite results including their information?

Accurately targeting the potential harms and interests at stake is only the first step in the debate about Graph and other similar technologies. Obscurity is a protective state that can further a number of goals, such as autonomy, self-fulfillment, socialization, and relative freedom from the abuse of power. A major task ahead is for society to determine how much obscurity citizens need to thrive.

Woodrow Hartzog and Evan Selinger

Woodrow Hartzog is an assistant professor at Samford University’s Cumberland School of Law and affiliate scholar at the Center for Internet and Society at Stanford Law School. Evan Selinger is an associate professor of philosophy at Rochester Institute of Technology and a fellow at the Institute for Ethics and Emerging Technology.
