That's where Georgetown University law professor Julie E. Cohen comes in. In a forthcoming article for the Harvard Law Review, she lays out a strong argument that addresses the titular concern "What Privacy Is For." Her approach is fresh, and as technology critic Evgeny Morozov rightly tweeted, she wrote "the best paper on privacy theory you'll get to read this year." (He was referring to 2012.)
At bottom, Cohen's argument criticizes the dominant position held by theorists and legislators who treat privacy as just an instrument used to advance some other principle or value, such as liberty, inaccessibility, or control. Framed this way, privacy is relegated to one of many defenses against things like another person's prying eyes, or Facebook's recent attempts to ramp up its use of facial-recognition software and collect further data about us without our explicit consent. As long as the principle in question can be protected through some other method, or if privacy gets in the way of a different desirable goal like innovation, privacy is no longer useful and can be disregarded.
Cohen doesn't think we should treat privacy as a dispensable instrument. To the contrary, she argues privacy is irreducible to a "fixed condition or attribute (such as seclusion or control) whose boundaries can be crisply delineated by the application of deductive logic. Privacy is shorthand for breathing room to engage in the process of ... self-development."
What Cohen means is that because life and contexts are always changing, privacy cannot be reductively conceived as one specific type of thing. It is better understood as an important buffer that gives us space to develop an identity somewhat separate from the surveillance, judgment, and values of our society and culture. Privacy is crucial for helping us manage all of these pressures -- pressures that shape the type of person we are -- and for "creating spaces for play and the work of self-[development]." Cohen argues that this self-development allows us to discover what type of society we want and what we should do to get there, both of which are key to living a fulfilled life.
Woodrow Hartzog and Evan Selinger make similar arguments in a recent article on the value of "obscurity." When structural constraints prevent unwanted parties from getting to your data, obscurity protections are in play. These protections go beyond preventing companies from exploiting our information for their financial gain. They safeguard democratic societies by furthering "autonomy, self-fulfillment, socialization, and relative freedom from the abuse of power."
In light of these considerations, what's really at stake in a feature like Facebook's rumored location-tracking app? You might think it is a good idea to willingly hand over your data in exchange for personalized coupons or promotions, or to broadcast your location to friends. But consumption -- perusing a store and buying stuff -- and quiet time alone are both important parts of how we define ourselves. If those activities become subject to ever-present monitoring, it can, even if unconsciously, change our behaviors and self-perception.