Anonymity Is Having a Moment

But when is it justified?

Andrew Harrer / Getty

Within newsrooms, the question of when it’s appropriate to use anonymous sources is frequently debated. It’s one of the many kinds of conversations—crucial and complicated—that are not typically visible to the public. But recently, anonymity has bubbled over into the national conversation, too, amplified by a president who frequently condemns it.

There was the anonymous New York Times op-ed by a senior White House official. The publication of Bob Woodward’s Fear. The anonymously sourced New York Times story that could potentially lead to Rod Rosenstein’s departure from the Justice Department. In a November 1925 Atlantic essay, the British novelist E. M. Forster grappled with this very subject: Anonymity, he argued, is best suited for fiction—and dangerous in newspapers.

Forster began his essay by presenting the two, sometimes overlapping, functions of words: to “give information” or “create atmosphere.” While the former applies to everything from stop signs to newspapers, Forster wrote, the latter encompasses the wide realm of literature. Poetry has “absolutely no use,” the novelist claimed. But fiction has use in its efforts to convey deep truths about the world; with fiction, the reader suspends “ordinary judgments” and enters “a universe that only answers to its own laws, supports itself, internally coheres, and has a new standard of truth.” Forster contended that there was no need for authorship in this universe.

The argument that words matter more than their writer has a long history, in journalism as well as fiction. In the mid-19th century, most magazines, The Atlantic included, didn’t publish bylines at all. “The names of contributors will be given out when the names are worth more than the articles,” the Atlantic co-founder Ralph Waldo Emerson once declared. Less than 10 years later, that policy changed: Starting in 1862, the magazine’s second editor, James T. Fields, published authors’ names semi-annually. In 1870, the magazine began printing names at the end of each article, part of Fields’s effort to commercialize The Atlantic’s authors. But even in 1925, when Forster wrote his essay in The Atlantic, it was still standard for newspaper writers to forgo bylines, aside from a few first-person pieces. Though the practice of anonymously authored nonfiction persists in some publications—notably The Economist—the majority of published material today includes a byline, chiefly for the sake of transparency.

Forster, too, valued transparency: “It seems paradoxical that an article should impress us more if it is unsigned than if it is signed. But it does, owing to the weakness of our psychology,” he wrote. “Anonymous statements have, as we have seen, a universal air about them. Absolute truth, the collected wisdom of the universe, seems to be speaking, not the feeble voice of a man.” According to Forster, “The modern newspaper has taken advantage of this.” Literature could exist as its own anonymous “absolute truth.” But information, Forster argued, “is relative,” true only if it is accurate, and consequently in need of attribution.

Ironically, Forster’s Atlantic diatribe against anonymity was directly followed by an anonymous essay, written by a man who didn’t want his family to know he was dying of cancer (at the time, the stigma against the disease frequently meant people used euphemisms when discussing it, if they talked about it at all). Though Forster didn’t have a chance to comment on the piece, the editors of The Atlantic did, noting the juxtaposition of the articles and lauding the way the anonymous man “describes his attitude toward Death, and his ordering of what remains of life, with such knowledge and courage as compose ‘absolute truth.’” As in many unsigned pieces The Atlantic has published over the years—a World War I deserter writing on his decision to leave, a 1940s Jewish man on changing his last name to something less Jewish, a 1960s woman on getting an abortion—the anonymous man in 1925 was first and foremost telling a personal story; like literature, it could exist as an absolute truth.

The Times op-ed, by contrast, can hardly be construed as personal. In a follow-up article prompted by nearly 23,000 reader questions, the Times’ op-ed editor remarked that the senior official’s piece intended “to describe, as faithfully as possible, the internal workings of a chaotic and divided administration and to defend the choice to nevertheless work within it.” According to Forster’s rubric, then, the op-ed writer has fallen short by failing to be accountable for the information he or she provides, and instead takes advantage of the credibility implied by anonymity’s “universal air.” The writer, even if unintentionally, blurs the boundaries between fact and fiction and delivers unsubstantiated absolutes—practices that echo those of the very man he or she is critiquing.

We can imagine how differently any of the recent anonymously sourced information would have been received had “the feeble voice of a man” been public knowledge. “The man who gives [information] ought to sign his name, so that he may be called to account if he has told a lie,” Forster wrote. “Make your statement, sign your name. That’s common sense.”