Forget About It: Making the Internet More Like Our Brains

The next wave of digital products won't just be about archiving the web; they'll be about destroying the archive.

Niagara Falls / Artur Staszewski/Flickr

Snapchat is an iPhone app that, fascinatingly and maybe even usefully, lets you apply a time limit to the photos you share with friends. You can decide whether your recipient (or a group of recipients) sees a photo for 2 seconds, or 5, or 10 ... before what they see disappears entirely. Think Path, with a focus on photos. Think Instagram, with an expiration date.

Since Snapchat allows users to send pictures to each other with slightly less fear of those pictures being seen by the wrong people, its most obvious use, Nick Bilton pointed out today, is -- yep -- sending suggestive photos. But the app's blink-and-you-miss-it UI speaks to a market for something much broader than sexting. Snapchat is a silly entry in a burgeoning genre: products that harness the power not of memory, but of forgetting.

Anti-archival tools provide a countervailing force to one of the defining features of the Internet: that, with its nearly infinite space, "save all" is its default setting. Without even trying, the Internet remembers. And that doesn't just mean that the comment you left on that Joss Whedon fan site that one time is still sitting there, emoticon-ed and gif-ed and captured for posterity within the all-knowing neurons of Google. It also means that the web, as a broad space, operates on both an assumption and an architecture of continuity. Within it, and all around it, archive is assumed. Even when we die ... there, still, we are.

So when we talk about the Internet, we talk about feeds and flows and rivers and currents -- things determined by their dynamism and their lack of obvious containers.

And: That's great! It's what makes the Internet the Internet! The problem, though, is that constant flux-and-flow is not actually how we humans are programmed to move through the world. We live in fits and starts, in cycles and phases, and we divide our time not just socially, in shared minutes and hours, but physically. We wake. We sleep. We have beginnings. We have endings.

Which means that, to the extent that the web is a realization of Wells's World Brain, it suffers from a congenital defect. Its capacities and ours are misaligned. We little humans are defined by our (sometimes painfully) selective memories; the web is defined by its promiscuity. It doesn't sleep; it doesn't process; it never, never rests. And while we humans can control our experience of the web -- just because everything's archived doesn't mean that we're forced to consume it -- its own lavish memory changes the way we users think about remembering itself. We become cavalier about preservation, not just because Google serves as an outboard brain, but because we are conditioned to assume that the stuff we care about will automatically stick around.

You'd think that would be liberating. And, for the most part, it is. (The history! The timeline! The cloud!) But there are also drawbacks to digital omniscience. It's telling that people diagnosed with hyperthymesia have described their limitless memories not as blessings, but as burdens -- ones that are "non-stop, uncontrollable, and totally exhausting." Near-perfect recall of their experiences doesn't make these people smarter; it makes them miserable.

Same deal, to a large extent, with the web. That's one reason people decry "information overload," not to mention a reason for proposals of digital sabbaths and the like. The web may be, in the broad cybernetic sense, a brain; as a user experience, though, it has some faulty wiring. When we disparage the digital environment as "overwhelming," what we're also faulting it for is its lack of a narrative. The Internet moves, but it doesn't necessarily move forward. It expands, but it doesn't necessarily follow any particular trajectory. It lacks, in that sense, a purpose. It lacks a plot. Men die, the Greek physician Alcmaeon believed, "because they cannot join the beginning and the end."

What we're beginning to realize, though, is that the World Brain, like our own comparatively fragile version, can be subject to neuroplasticity. We can change the web's wiring. We can make it more hospitable to the way our minds are programmed to work. The proposed legal principle of le droit à l'oubli -- the "right to be forgotten," but also, tellingly, the "right of oblivion" -- will likely find its counterpart in the U.S., if not through the courts, then through the architecture of the web itself. Silly little products like Snapchat are part of that -- not just because they give us new filters to help us make sense of the digital world, but because they help us to reclaim the productive limitations of the analog.

And those products won't just become increasingly common; they'll also become increasingly valuable. Just as limitation itself -- through social filters, through editorial filters, through acts of extreme curation -- will likely become increasingly valuable. Just as the textual limitations of lists and the visual limitations of memes hold sway, organically, over the sharing economy of the web, we'll keep coming up with creative ways to curtail the web's impulses toward continuity. And that, in turn, will allow us to re-appropriate remembering -- not just as a passive assumption, but as a deliberate choice. And not just as an act of preservation, but as an act of love. Last week, Betaworks' social news service launched Last Great Thing, a time-limited version of getting the news that asks participants to share just one worthy thing they've found on the web that day -- permalinks not included. The product's point is awesomeness-without-archive. But it's also ephemerality-as-service. It allows us to do what our minds are, actually, optimized to do: to experience, to forget, to remember, and then forget again.