"People love Facebook. They really love it," Biz Stone wrote earlier this month. "My mother-in-law looks hypnotized when she decides to put in some Facebook time."
She is not the only one. ComScore estimates Facebook eats up 11 percent of all the time spent online in the United States. Its users spend an average of 400 minutes a month on the site.
I know the hypnosis, as I'm sure you do, too. You start clicking through photos of your friends of friends and next thing you know an hour has gone by. It's oddly soothing, but unsatisfying. Once the spell is broken, I feel like I've just wasted a bunch of time. But while it's happening, I'm caught inside the machine, a human animated GIF: I. Just. Cannot. Stop.
Or maybe it'll come on when I'm scrolling through tweets at night before bed. I'm not even clicking the links or responding to people. I'm just scrolling down, or worse, pulling down with my thumb, reloading, reloading.
Or sometimes, I get caught in the melancholy of Tumblr's infinite scroll.
Are these experiences, as Stone would have it, love? The tech world generally measures how much you like a service by how much time you spend on it. So a lot of time equals love.
My own intuition is that this is not love. It's something much more technologically specific that MIT anthropologist Natasha Schüll calls "the machine zone."
"It's Not About Winning, It's About Getting Into the Zone"
Schüll spent more than a decade going to Las Vegas and talking with gamblers and casino operators about slot machines, which have exploded in profitability during the digital era as game designers have optimized them to keep people playing.
What she discovered is that most people playing the machines aren't there to make money. They know they're not going to hit the jackpot and go home. As Roman Mars put it in a recent episode of his awesome podcast, 99% Invisible, on Schüll's research: "It's not about winning; it's about getting into the zone."
What is the machine zone? It's a rhythm. It's a response to a fine-tuned feedback loop. It's a powerful space-time distortion. You hit a button. Something happens. You hit it again. Something similar, but not exactly the same happens. Maybe you win, maybe you don't. Repeat. Repeat. Repeat. Repeat. Repeat. It's the pleasure of the repeat, the security of the loop.
"Everything else falls away," Schüll says to Mars. "A sense of monetary value, time, space, even a sense of self is annihilated in the extreme form of this zone that you enter."
In Schüll's book, Addiction by Design, a gambler named Lola tells her: "I'm almost hypnotized into being that machine. It's like playing against yourself: You are the machine; the machine is you."
There's that word again: hypnotized, like Stone's mother-in-law. Many gamblers used variations on the phrase. "To put the zone into words," Schüll writes, "the gamblers I spoke with supplemented an exotic, nineteenth-century terminology of hypnosis and magnetism with twentieth-century references to television watching, computer processing, and vehicle driving."
They said things like, "You're in a trance, you're on autopilot. The zone is like a magnet, it just pulls you in and holds you there."
Why these words, these metaphors? We don't cognitively grasp the state we fall into -- we only feel its grip on us -- the way we've merged circuits with the inanimate. You are the machine; the machine is you. And it feels ... the words fail. In fact, it feels like words failing because it is at the edge of human experience, bleeding over into a cybernetic realm best expressed in data and code.
The machine zone is the dark side of "flow," a psychological state proposed by Mihály Csíkszentmihályi. In a flow state, there is a goal, rules for getting to the goal, and feedback on how that's going. Importantly, the task has to match your skills, so there's a feeling of "simultaneous control and challenge."
In a 1996 Wired interview, Csíkszentmihályi described the state like this: "Being completely involved in an activity for its own sake. The ego falls away. Time flies. Every action, movement, and thought follows inevitably from the previous one, like playing jazz."
Schüll sees a twist on this phenomenon in front of the new slot machines of Vegas, which incorporate tiny squirts of seeming control to amp up their feedback loops. But instead of the self-fulfillment and happiness that Csíkszentmihályi describes, many gamblers feel deflated and sad about their time on the slots.
The games exploit the human desire for flow, but without the meaning or mastery attached to the state. The machine zone is where the mind goes as the body loses itself in the task. "You can erase it all at the machines," a gambler tells Schüll. "You can even erase yourself."
You can get away from it all in the machine zone, but only as long as you stay there.
The Facebook Zone
When we get wrapped up in a repetitive task on our computers, I think we can enter some softer version of the machine zone. Obviously, if you're engaged in banter with friends or messaging your mom on Facebook, you're not in that zone. If you're reading actively and writing poems on Twitter, you're not in that zone. If you're making art on Tumblr, you're not in that zone. The machine zone is anti-social, and it's characterized by a lack of human connection. You might be looking at people when you look through photos, but your interactions with their digital presences are mechanical, repetitive, and reinforced by computerized feedback.
I'm not claiming that people are "addicted" to Facebook. Some of the gamblers quoted in Schüll's research do in fact have serious problems. But I am using their stories as Schüll did -- as sources of expertise on the zone, not to say their experience with slot machines is exactly like your average user's time on Facebook.
I point this out because there is a tendency to toss around the idea of addiction to various technologies like it's no big deal. But it is.
All of this to say: I'm not making an argument about the totality of services like Facebook. This is a criticism of specific behavioral loops that can arise within them.
The purest example of an onramp into the machine zone is clicking through photo albums on Facebook. There's nothing particularly rewarding or interesting about it. And yet, show me the Facebook user who hasn't spent hours and hours doing just that. Why? You can find the zone. Click. Photo. Click. Photo. Click. Photo. And perhaps, somewhere in there, you find something cool ("My friend knows my cousin.") or cute ("Kitten."). Great. Jackpot! Click. Photo. Click. Photo. Click. Photo.
Facebook is the single largest photo-sharing service in the world. In 2008, when the site had 10 billion photographs archived, users pulled up 15 billion images per day, at rates of up to 300,000 images per second. Click. Photo. Click.
By 2010, users had uploaded 65 billion images to Facebook, and they were served up at a peak rate of 1 million per second. By 2012, Facebook users were uploading 300 million photos per day. And early this year, Facebook announced users had entrusted them with 240 billion photos.
If we assume the ratio of photos stored to photos viewed has not declined precipitously, users are probably pulling up billions of Facebook photos per day at a rate of millions per second. Click. Photo. Click.
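The back-of-envelope extrapolation can be made explicit. This sketch projects forward from the reported 2008 figures; the 2013 view counts it produces are an assumption, not reported numbers:

```python
# Extrapolate Facebook photo views from the reported figures above.
# Only the 2008 and 2013 inputs are reported; the outputs are projections.

SECONDS_PER_DAY = 24 * 60 * 60

# 2008 (reported): 10 billion photos stored, 15 billion views per day.
views_per_stored_photo_per_day = 15e9 / 10e9  # 1.5 views per stored photo per day

# 2013 (reported): 240 billion photos stored.
stored_2013 = 240e9

# Projection, assuming the 2008 viewing ratio held steady.
projected_views_per_day = stored_2013 * views_per_stored_photo_per_day
projected_views_per_second = projected_views_per_day / SECONDS_PER_DAY

print(f"Projected views/day: {projected_views_per_day:.2e}")    # ~3.6e11, i.e. hundreds of billions
print(f"Projected views/sec: {projected_views_per_second:.2e}")  # ~4.2e6, i.e. millions
```

Even if the real ratio has fallen by an order of magnitude, the result still lands in the "billions per day, millions per second" range the paragraph above describes.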
It all adds up to a lot of time spent in the loop. According to a 2011 ComScore report, users spend 17 percent of their time on the site exclusively browsing photos (which, as Inside Facebook notes, doesn't include "time spent reading news feed stories and notifications generated by photo uploads").
To put these numbers in perspective, ComScore's 2013 Digital Focus report found that Facebook took 83 percent of the time spent on *all* social networks on the web. That means that of all the time spent on social networks, 14 percent of it occurs within this one behavioral loop. That's more than all the time spent on Tumblr, Pinterest, Twitter, and LinkedIn combined!
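The 14 percent figure falls out of multiplying the two ComScore shares together:

```python
# Deriving the 14 percent figure from the two ComScore shares cited above.
facebook_share_of_social_time = 0.83  # ComScore 2013: Facebook vs. all social networks
photo_share_of_facebook_time = 0.17   # ComScore 2011: photo browsing vs. all Facebook time

# Share of ALL social-network time spent clicking through Facebook photos.
photo_share_of_social_time = (
    facebook_share_of_social_time * photo_share_of_facebook_time
)
print(f"{photo_share_of_social_time:.0%}")  # prints "14%"
```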
If all technological artifacts contain certain "prescriptions" within them, if designers can inscribe intentions into the things they build, as in sociologist Bruno Latour's theory, then we can say that some engagement mechanisms are more prescriptive than others.
What Facebook and slot machines share is the ability to provide fast feedback to simple actions; they deliver tiny rewards on an imperfectly predictable "payout" schedule. These are coercive loops, distorting whatever the original intention of the user was. What began as "See a picture of person X" becomes "keep seeing more pictures." The mechanism itself becomes the point.
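In reinforcement terms, an "imperfectly predictable payout schedule" is a variable-ratio schedule: each action has a small, fixed chance of a reward, so payouts arrive after an unpredictable number of actions. Here is a toy sketch of that loop; the hit probability and the function name are invented for illustration:

```python
import random

def simulate_clicks(n_clicks: int, hit_probability: float = 0.08,
                    seed: int = 1) -> list[int]:
    """Toy variable-ratio payout schedule: every click has the same small
    chance of a tiny reward (a cute photo, a familiar face). Returns the
    indices of the clicks that paid out."""
    rng = random.Random(seed)
    return [i for i in range(n_clicks) if rng.random() < hit_probability]

hits = simulate_clicks(100)
# The gaps between rewards vary unpredictably -- that irregularity,
# not the size of any one payout, is what keeps the loop going.
gaps = [b - a for a, b in zip([0] + hits, hits)]
print(f"{len(hits)} rewards in 100 clicks; gaps between rewards: {gaps}")
```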
Slot-game designers, for their part, have had to grapple with the ethical issues raised by exploiting the machine zone. And that grappling hasn't been pretty.
Schüll talks about one designer, Randy Adams. At first, he tells her that he's "morally" opposed to building machines that enable compulsive behavior, which is an acknowledgement that it's possible to do so. "But on this point Adams was not consistent," she writes. "[Adams] began by locating addiction within the person, stating that 'some people can't control the part that turns it from fun into addiction.' When pressed to specify 'the part that turns it from fun into addiction,' he replied: 'It's the design of the game,' and then added that this characteristic of design was 'not intentional on our part, just the way it happened to evolve.'"
What would it mean for the project of social media if we understood it to induce similar psychological states to machine-based gambling? Would Silicon Valley employees struggle with their product the way slot-machine designers do? I know a lot of coders and people who've worked for various social companies; they certainly don't see themselves as being in the same core business as a casino. Most of them think they're "doing well by doing good."
As a thought experiment, imagine there were incontrovertible proof that certain web service designs caused people to enter the machine zone, quadrupling time on site for a subset of users. Would designers swear off those techniques, or would they all deploy them for their startups?
Things could be different. A site could encourage a different ethic of consumption. To be a little absurd: Why not post a sign after someone has looked through 100 pictures that says, "Why not write a friend or family member a note instead?"
Shouldn't these things be part of what web companies think about? Not just encouraging users to consume more and more, but helping them stop.
The Problem of "Giving People What They Want"
You could argue that designers are simply giving the people what they want. The data says people spend a lot of time looking at pictures; so, Facebook serves up the pictures. Simple as that.
Engagement is usually the currency of the social network realm. Since it's much harder to measure whether someone is actually enjoying an experience than it is to measure the number of minutes someone spends doing it, engagement is typically measured by time. And so, Silicon Valley has made the case to itself (and to the users of its software) that we are voting with our clicks.
But there's a problem. A definition of "what people want" got smuggled in with the data. The definition starts logically: People go to sites they like. But then it gets wobblier: the more time you spend on a site or part of a site, the more you must like it. Of course, that completely elides the role the company itself plays in shaping user behavior to increase consumption. And it ignores that people sometimes (often?) do things to themselves that they don't like. Who "likes" spending hours flipping channels? And yet it's been a core part of the American experience for decades.
What if the 400 minutes a month people spend on Facebook is mostly (or even partly) spent in the machine zone, hypnotized, accumulating ad impressions for the company?
Here's my contention: Thinking about the machine zone and the coercive loops that initiate it has great explanatory power. It explains the "lost time" feeling I've had on various social networks, and that I've heard other people talk about. It explains how the more Facebook has tuned its services, the more people seem to dislike the experiences they have, even as they don't abandon them. It helps explain why people keep going back to services that suck them in, even when they say they don't want to.
It helps me understand why social media, which began with the good intention of connecting people, has become such a fraught subject. Among the tech savvy, it is seen as an act of bravery to say, "I love Facebook."
Because designers and developers interpreted maximizing "time on site," "stickiness," and "engagement" as giving people what they wanted, they built systems that elicit compulsive responses people later regret.
At the very least, the phenomenon of the machine zone has to become a part of the way we talk about the pleasures of the Internet. Perhaps, over the long run, these problems will self-correct. I'm not so sure, though: The economic forces at the heart of ad-supported social networks basically require maximizing how much time people spend on a site, generating ad impressions.
It just so happens that the user behavioral patterns that are most profitable for Facebook and other social networks are precisely the patterns that they've interpreted to mean that people love them. It's almost as if they determined what would be most profitable and then figured out how to justify that as serving user needs.
But I actually don't believe that. You can say many things about the entrepreneurs, designers, and coders who create social networking companies, but they believe in what they do. They're more likely to be ideologues than craven financial triangulators. And they spend all day on Facebook, Instagram, Twitter, Tumblr, and Pinterest, too. I bet they know the machine zone themselves. And that's why I have hope they might actually stop designing traps.
In any case, fighting the great nullness at the heart of these coercive loops should be one of the goals of technology design, use, and criticism.
In the great tradition of the Valley, we'll make a t-shirt: Just Say No To The Machine Zone.