The Tragedy of the Digital Commons
Advocates for fairer, safer online spaces are turning to the conservation movement for inspiration.
When her husband lost his job in 2010, Kristy Milland realized how important the Internet had become to her family's survival. For several years, the 30-something Canadian high-school graduate had a hobby of completing paid micro-tasks on Amazon's Mechanical Turk, an online marketplace that sells crowdsourced labor. She answered surveys, tagged images, and trained artificial intelligences for a few cents or dollars a task. In time, Milland became community manager of TurkerNation, one of several major forums for worker discussion and peer support.
With bills looming, Milland realized, "I had to turn this into a real gig." Now that her digital work was her family's primary income, she felt for the first time how hard it was to make ends meet. Mechanical Turk, which has no minimum wage, is a free market for digital labor. Only the collective decisions of workers and requesters determine wages and working conditions. Since Amazon provides no way for workers to rate employers, workers can’t always anticipate if they will be treated well or paid fairly. As a result, making a living on Mechanical Turk is a precarious venture, with few company policies and a mostly hands-off attitude from Amazon.
Milland and other regular Turkers navigate this precariously free market with Turkopticon, a DIY technology for rating employers created in 2008. To use it, workers install a browser plugin that extends Amazon's website with special rating features. Before accepting a new task, workers check how others have rated the employer. After finishing, they can also leave their own rating of how well they were treated.
Collective rating on Turkopticon is an act of citizenship in the digital world. This digital citizenship acknowledges that online experiences are as much a part of our common life as our schools, sidewalks, and rivers—requiring as much stewardship, vigilance, and improvement as anything else we share.
“How do you fix a broken system that isn't yours to repair?” That’s the question that motivated the researchers Lilly Irani and Six Silberman to create Turkopticon, and it’s one that comes up frequently in digital environments dominated by large platforms with hands-off policies. (On social networks like Twitter, for example, harassment is a problem for many users.) Irani and Silberman describe Turkopticon as a “mutual aid for accountability” technology, a system that coordinates peer support to hold others accountable when platforms choose not to step in.
Mutual aid accountability is a growing response to the complex social problems people face online. On Twitter, systems like The Block Bot and BlockTogether coordinate collective judgments about alleged online harassers, then block, for everyone in the group, the accounts that the group prefers not to hear from. Last month, the advocacy organization Hollaback raised over $20,000 on Kickstarter to create support networks for people experiencing harassment. In November, I worked with the advocacy organization Women, Action, and the Media, which took on a role as "authorized reporter" with Twitter. For three weeks WAM! accepted reports, sorted evidence, and forwarded serious cases to Twitter. In response, the company warned, suspended, and deleted the accounts of many alleged harassers.
These mutual aid technologies operate in the shadow of larger systems with gaps in how people are supported—even when platforms do step in, says Stuart Geiger, a Berkeley Ph.D. student. In other words, sometimes a platform’s system-wide solutions to a problem can create their own problems. For several years, Geiger and his colleague Aaron Halfaker, now a researcher at Wikimedia, were concerned that Wikipedia’s semi-automated anti-vandalism systems might be making the site unfriendly. As a graduate student unable to change Wikipedia’s code, Halfaker created Snuggle, a mutual-aid mentorship technology that tracks the site’s spam responders. When Snuggle users think a newcomer’s edits were mistakenly flagged as spam, the software coordinates Wikipedians to help those newcomers recover from the negative experience of being reverted.
By organizing peer support at scale, the designers of Turkopticon and its cousins draw attention to common problems, hoping to influence longer-term change on a complex issue. In time, the idea goes, requesters on Mechanical Turk might change their treatment of workers, Amazon might change its policies and software, or regulators might set new rules for digital labor. This is an approach with a long history in an area that might seem unlikely: the conservation movement. (Silberman and Irani cite the movement as inspiration for Turkopticon.)
To better understand how this approach might influence digital citizenship, I followed the history of mutual-aid accountability in a precious common resource that the city of Boston enjoys every day: the Charles River. Planned, re-routed, exploited and contested, it has inspired and supported human life since before written history.
As early as 3200 B.C. and continuing for over 1,500 years, Native Americans re-routed the flow of water near Boston to catch fish in constructions that covered over two acres. The food and fertilizer supplied a sizable community until rising water levels made their economy unsustainable. Colonial dams and bridges were constructed on the Charles from the 1640s, and Harvard University was partly funded by ferry and bridge tolls for nearly 200 years. Across this river, Paul Revere received covert optical transmissions about British military movements from Old North Church. Two months later, British warships would sail its waters in an attempt to capture Bunker Hill. In the 19th century, Henry Wadsworth Longfellow crossed it to see his sweetheart Frances Appleton, writing of the Charles River:
As long as the heart has passions,
As long as life has woes;
The moon and its broken reflection
And its shadows shall appear,
As the symbol of love in heaven,
And its wavering image here.
Longfellow’s poem didn't mention the pollution from 43 mills along the riverbanks that prompted the government to abandon the idea of cleanup efforts in 1875. Absent from his poem, too, are the chemical spills from the Watertown Arsenal, later designated a Superfund site by the Environmental Protection Agency, or the municipal sewage systems that fed directly into the river. Nor was this problem solely created by institutions. In the 1950s, when the river's toxic pink and orange waters were closed to swimmers, Bernard DeVoto described an informal landfill along the river in Harper's Magazine as “Hell's Half Acre.” Urging Bostonians to take action, DeVoto lamented that the river had become “foul and noisome, polluted by offal and industrial wastes, scummy with oil, unlikely to be mistaken for water.”
When the ecologist Garrett Hardin set out to write his famous 1968 article on problems with “no technical solution,” “The Tragedy of the Commons,” he could have been describing the Charles. Hardin imagines open grazing areas managed by multiple herders who destroy their precious common when each rationally seeks to maximize personal gain. The problems of digital labor can also be interpreted through this tragedy. With a Mechanical Turk worker turnover rate of 69 percent every six months, requesters tend to seek the minimum price for someone’s labor, and workers compete for diminishing pay. With minimal accountability for the companies requesting work and limited intervention from Amazon, attractive stories of flexible, livable income from digital labor remain as partially true as Longfellow’s poetic image of the beautiful river Charles.
Academics advancing the idea of digital commons have tended to focus on how to prevent or regulate these problems—after they're identified. In Code and Other Laws of Cyberspace, Larry Lessig describes software design as a kind of regulation separate from top-down policies or community norms. Sixteen years after Lessig’s book, belief in the power of code and social psychology to shape successful online communities is widespread among the design teams who govern our digital lives. Their growing toolbox of design options is detailed in a recent law review article by James Grimmelmann, who covers everything from banning and shaming to reputation and rewards. In this view, perhaps Mechanical Turk could become fairer if Amazon added the right buttons, set the right default wage, or changed its design to activate just the right motivations.
If code is law online and platform designers are its legislators, who identifies the problems and sets the goals for those laws? Throughout her career, the Nobel Prize-winning economist Elinor Ostrom studied the successes of monitoring programs run by the communities connected to the resources they share. Like Hardin, Ostrom saw no single technical or legal solution to the complex problems of common resources. Yet in place of Hardin's selfish freeloading herders, Ostrom described a cooperative, “co-evolutionary race,” a struggle among the frenemies who share common spaces. Ostrom observed that in well-managed common resources, monitoring is part of a wider system in which participants hold one another in check for the common good. On the Charles River, this monitoring keeps people safe while also supporting long-term change.
Two summers ago, officials declared the Charles River safe for public swimming after a 50-year ban. Throughout that period, boaters kept each other safe through community monitoring. Every summer morning, boathouses along the river raise colored flags to signal the water's estimated bacteria level. A red flag signals high levels of E. coli, a blue flag means a river safe for boating, and a yellow flag warns of inconclusive statistical predictions. Behind each morning’s colored flag is the story of a decades-long struggle among citizen groups, scientists, planners, local companies, and government to reverse the tragedy of the Charles.
Where Turkers click buttons to rate employers, the river’s users dunk bottles to monitor water quality. Every month, teams of volunteers on bridges, banks, and boats drop open bottles into the Charles River at 30 locations along its 80-mile length. After capping the bottles and taking notes on river flow and the weather, volunteers pass samples to the Charles River Watershed Association (CRWA), a ritual they have maintained for 20 years. CRWA staff scientists analyze the samples and add the data to statistical models they developed to predict river bacteria. Combining those models with weather and river-temperature data, CRWA can offer daily predictions of boating safety. When conditions are less predictable, the boathouses that “publish” this data fly a yellow flag to signal the uncertainty. Data from the samples is also used to hold polluters accountable and advocate for change.
Successful mutual aid doesn't guarantee wider change. The creators of Turkopticon sometimes worry that patching up problems for only some people might make wider change even harder to achieve. Seven years later, Amazon still hasn't added work-requester accountability to their platform. (A spokesperson told me that “our customers are telling us this feature is valuable and we’re looking at ways in which we can offer it.”) For now, new workers are expected to find and use Turkopticon on their own. On the Charles River, however, mutual aid accountability has been key to its transformation.
“Our policy work is based on our science,” says Margaret Van Deusen, the director of Charles River Watershed Association's law, advocacy, and policy work. Cleaning up a river requires careful monitoring to identify sources of problems and judge the effectiveness of new cleanup ideas. Founded in 1965, the CRWA has cajoled and supported federal and local government to clean up pollution along the river, including a military research lab, hospital waste, and sewer systems. According to Elisabeth Ciancola, the aquatic scientist who manages the citizen monitoring program, citizen data from all those bottles helped CRWA successfully advocate to close sewer runoffs along the river, reducing pathogen-rich overflows by 98 percent.
Online, though, it is hard to bottle a representative sample of a common problem. When monitoring a river, volunteers can take samples at carefully-specified locations without trespassing or violating privacy. Those samples can be analyzed using standardized methods with verified accuracy. In contrast, mutual-aid accountability systems like Turkopticon are dependent on those who use them and on the subjective judgments of the people who provide mutual aid. “Turkopticon's intents are great,” says Kristy Milland, the digital worker in Canada. Ratings are “a good suggestion but not necessarily 100 percent accurate.”
This subjective knowledge might be an advantage, says Stuart Geiger, the researcher who studies Wikipedia. Quoting the science and technology scholar Donna Haraway, Geiger says that “situated knowledge” from the people facing a problem can give us “a more adequate, richer, better account of a world, in order to live it well.” Geiger's own Wikipedia research merges this situated knowledge with quantitative methods that are designed to offer a representative understanding of behavior on the site.
But citizen monitoring is only effective at wider change when monitoring groups are able to convince powerful entities to take them seriously. Building on their citizen science, the CRWA calculated a “pollution budget” of how much the river could handle, an idea that became part of state pollution laws in 2011. The metric for “total maximum daily load” gives advocates an opportunity to educate planners on the consequences of their proposals and hold them accountable for those consequences, says Van Deusen, the CRWA director. Other situations require powerful allies. In a recent controversy over medical waste from an abandoned hospital, the CRWA and a local city council successfully pressured the state to expand its cleanup measures.
Workers on Mechanical Turk have moved from mutual aid to organized advocacy in the last year. Turkers have a lively history of online conversations across multiple forums, Facebook groups, subreddits, and anonymous chat rooms, where they discuss work and share software like Turkopticon. When the Stanford Ph.D. student Niloufar Salehi went looking for ways to help, she was initially stumped. “One of the biggest challenges was to figure out what they didn’t have,” she said in an interview.
Salehi soon found that academic researchers are creating some of workers’ greatest problems, sending high volumes of unpaid and under-paid tasks with surveys, social experiments, and studies of Turkers themselves. Academics often reason that the public good from research justifies the poor treatment and disregard they give digital workers. In one case, an economics professor attempted a deception-based study that added fake information into Turkopticon itself. Workers quickly detected the project and shut it down. One study on ethics paid workers $1.50 for a 39-minute task. These problems are so common that Rochelle LaPlante, a moderator for two Turker discussion boards, tweets out daily examples of best and worst researcher ethics.
Last year, workers started a digital campaign to develop ethical guidelines for academics, drafting them on Dynamo, a deliberation website that Salehi created. The document has since been signed by over 150 workers and 50 academics. When some of these workers see academic tasks they consider unfair, they now send researchers a link to the guidelines and insist that they sign.
Early success at monitoring and changing research ethics has motivated some Turkers to appeal for changes by Amazon. In December, Kristy Milland and other Turkers used Dynamo to organize a Christmas letter to Jeff Bezos asking him to improve working conditions, not just for North American workers like her, but also for Indian workers who consistently reported late paychecks. According to Milland, the company switched to paying Indian workers by direct deposit this spring, a detail Amazon confirmed to me. It wasn't everything they asked for, but it was a start.
Few discussions of the Internet take place on generational scales. Created only five years ago, Randall Munroe’s Map of Online Communities, above, already looks as foreign as maps of pre-colonial Boston. But Geiger, who recently called on designers and social scientists to ethically embrace their role as the web's “civil servants,” believes that enduring progress is possible amid the perpetual disruption of digital ventures. “Even if Twitter or Mechanical Turk or Wikipedia die in 10 years and something new replaces all of them, we're still going to have these issues, and the people who are tackling those issues will still care about them.”
What might it mean for digital citizens to play a greater role in the long-term operation of online platforms? In Europe, lawmakers and courts have a history of regulating the details of algorithms like Google search. Another idea is a Magna Carta for “consent of the networked,” according to the journalist and anti-censorship advocate Rebecca MacKinnon. This idea, backed by the web's creator Tim Berners-Lee, might bind platforms to the consent of their users, even when companies span multiple countries and jurisdictions. One example of this might be the Wikimedia Foundation, which reserves half of its board positions for elected Wikipedians. Wikimedia also leaves many governance details to its community in each of its language groups, like a federal government composed of many states.
Managing a commons is more complex than users versus platforms. In cases like Mechanical Turk, Amazon helps its users hold each other accountable by sharing data with systems like Turkopticon. Perhaps similar data sharing could help researchers and citizen groups audit algorithms from the outside. Nor does this work need to happen entirely outside platforms. Public research like Facebook's recent study on political bias helps the public understand and debate the state of our shared digital lives.
Whatever approach takes hold, digital citizens are playing a critical role. In 2012, Kristy Milland's family found more stable income. She's now finishing a university degree. No longer clicking buttons 17 hours a day, she has found more time to support digital labor rights. This spring, she joined other digital workers on a University of California Santa Cruz and Stanford project to develop a fairer crowdsourcing marketplace.
Milland is also optimistic about Amazon, whose platform remains a dominant source of digital work. Encouraged by workers’ participation in the Christmas letter campaign, she and other Turkers are eager to try new campaigns. “Turkers are humans,” she says. “If they understand we're human, they will treat us better.”
Mutual-aid accountability is one way that digital citizens like Milland take up the same kind of stewardship of our online lives that we give our schools, sidewalks, and rivers. Online, it's easy to point out problems, just as DeVoto lamented “Hell's Half Acre.” But sustained action for the common good can have surprisingly far-reaching outcomes.
Last October, during secret climate negotiations with China, U.S. Secretary of State John Kerry went to lunch overlooking Boston Harbor with a top Chinese official, Yang Jiechi, telling him the story of the Charles River cleanup. Kerry’s lunch with Yang was described by The Washington Post as a “key moment” in climate negotiations, leading to a groundbreaking deal between the world’s top carbon polluters.
The Charles River had become the symbol of a different kind of love than Longfellow imagined, an active hope maintained by citizens for over 60 years: to make the river as swimmable as it is beautiful.