Last week, another distasteful use of your personal information by Google came to light: The company plans to attach your name and likeness to advertisements delivered across its products without your permission.
As happens every time the search giant does something unseemly, Google's plan to turn its users into unwitting endorsers has inspired a new round of jabs at Google's famous slogan "Don't be evil." While Google has deemphasized the motto over time, it remains prominent in the company's corporate code of conduct, and, as a cornerstone of its 2004 Founders' IPO Letter, the motto has become an inescapable component of the company's legacy.
Famous though the slogan might be, its meaning has never been clear. In the 2004 IPO letter, founders Larry Page and Sergey Brin clarify that Google will be "a company that does good things for the world even if we forgo some short term gains." But what counts as "good things," and who constitutes "the world"? The slogan's significance has likely changed over time, but today it seems clear that we're misunderstanding what "evil" means to the company. For today's Google, evil isn't tied to malevolence or moral corruption, the customary senses of the term. Rather, Google's sense of evil is better understood as the disruption of its brand of (computational) progress.
Of course, Google doesn't say so in as many words; the company never defines "evil" directly. But when its executives talk about evil, they leave us clues. In a 2003 Wired profile of the company, Josh McHugh noted that while other large companies maintain lengthy corporate codes of conduct, Google's entire policy was summarized by just those three words, "Don't be evil." While there's some disagreement about its origins, Gmail creator Paul Buchheit reportedly conceived of the slogan, calling it "kind of funny" and "a bit of a jab at a lot of the other companies, especially our competitors, who at the time, in our opinion, were kind of exploiting the users to some extent."
In critiques of Google's dubious fidelity to the motto, most assume either that the company was once virtuous and has since fallen from grace, or that it has been forced to compromise its values for the market. Even ten years ago, McHugh cast the situation as a side effect of growth, describing how difficult it was for Google to maintain a Tron-style "fight for the users" ideal as it became an enormously influential global information company. Others see the slogan as a diversion. In his book The Googlization of Everything, Siva Vaidhyanathan observes that the "Don't be evil" slogan "distracts us from carefully examining the effects of Google's presence and activity in our lives." True, but the slogan itself also counts as one such activity. Understanding what evil means to Google might be central to grasping its role in contemporary culture.
In an NPR interview earlier this year, former CEO and executive chairman Eric Schmidt justified the policy with a paradigmatic example:
So what happens is, I'm sitting in this meeting, and we're having this debate about an advertising product. And one of the engineers pounds his fists on the table and says, that's evil. And then the whole conversation stops, everyone goes into conniptions, and eventually we stopped the project. So it did work.
Schmidt admits that he thought it was "the stupidest rule ever" upon his arrival at the company, "because there's no book about evil except maybe, you know, the Bible or something." The contrast between the holy scripture and the engineer's fist is almost allegorical: in place of a broadly construed set of sociocultural values, Google relies instead on the edict of the engineer. That Schmidt doesn't bother describing the purportedly evil project in question only further emphasizes the matter: Whatever the product did or didn't do is irrelevant; all that matters is that Google passed judgment upon it. The system worked. But on whose behalf? Buchheit had explained that early Googlers felt their competitors were exploiting users, but exploitation is relative. Even back in the pre-IPO salad days of 2003, Schmidt explained "Don't be evil" via its founders' whim: "Evil is what Sergey says is evil."
All moral codes are grounded in something: a religious tradition, a philosophical doctrine, a cultural practice. Google's take on virtue doesn't reject such grounds so much as create a new one: the process of googlization itself. If anything, Google's motto seems to have largely succeeded at reframing "evil" to exclude all actions performed by Google.
There is a persistent idea that Internet technology companies embody an innocent populism. That the rational engineer is an earnest problem-solver, his fists striking tables instead of noses. But there's something treacherous in believing that virtue and vice can be negotiated in the engineering of an email client or the creation of a spreadsheet.
Companies like Google actually embody a particular notion of progress rather than populism, one that involves advancing their technology solutions as universal ones. Evil is vicious because it inhibits this progress. If Google has made a contribution to moral philosophy, it amounts to a devout faith in its own ability to preside over virtue and vice through engineering. The unwitting result: We've not only outsourced our email hosting and office suite provisioning to Google, but also our information ethics. Practically speaking, isn't it just easier to let Google manage right and wrong?
We can already find signs of the spread of this lesser-known, engineer's sense of evil in Wiktionary, a crowdsourced dictionary run by the group that operates Wikipedia. There, the word "evil" is revealed to have acquired a domain-specific meaning in computing:
evil (computing, programming, slang) undesirable; harmful; bad practice
Global variables are evil; storing processing context in object member variables allows those objects to be reused in a much more flexible way.
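The Wiktionary usage example compresses a familiar refactoring. Here is a minimal sketch of the distinction it draws; the names and the tax-rate scenario are hypothetical, chosen only for illustration:

```python
# The practice the Wiktionary example calls "evil": processing context
# kept in a module-level global, so every caller shares one configuration.
TAX_RATE = 0.08  # global state; changing it affects all callers at once

def total_with_tax(price):
    return price * (1 + TAX_RATE)

# The flexible alternative: context stored in an object member variable,
# so differently configured instances can coexist and be reused freely.
class TaxCalculator:
    def __init__(self, rate):
        self.rate = rate  # per-instance processing context

    def total(self, price):
        return price * (1 + self.rate)

# Two independently configured calculators, no shared mutable state:
us = TaxCalculator(0.08)
eu = TaxCalculator(0.20)
```

With the global, supporting a second rate means mutating TAX_RATE around every call; with the member variable, each instance simply carries its own context. Nobody is harmed by the first version except its own maintainers, which is precisely the point of the programmer's sense of "evil."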
Wiktionary's entry is but one specimen, but it is exemplary of Google's seemingly incongruous moral behavior. Understood in the programmer's sense, "evil" practices are just contraindicated ones: things that might seem reasonable in the moment but will create headaches down the line. This kind of evil is internally focused, selfish even: it's perpetrated against the actor rather than the public. Insofar as "bad practice" evils have victims, those victims are always members of the community of its perpetrators. Like the programmer's stock rejoinder "considered harmful," a phrase popularized by Edsger Dijkstra's 1968 letter objecting to the GOTO statement, a computational evil is one committed against engineering custom or convention.