What Facebook and Google Can Learn From the First Major News Hoax
The original American penny press told readers that horny bat-people lived on the moon. The year was 1835. Even in 2017, its lessons are more relevant than ever.
In the age of the platform, how can anybody be sure that the news they read is true?
Last week, lies, hoaxes, and rumors prevailed on Facebook and Google following the mass murder in Las Vegas. The sites promoted stories claiming that the killer was a Rachel Maddow fan, or an ISIS follower—both false. As gatekeepers of valuable information, the platforms failed, The Atlantic’s Alexis Madrigal wrote. This was a familiar story. After the 2016 election, Russian propaganda and false memes on Facebook elevated “fake news” in the national lexicon. Today, with two-thirds of Americans getting their news from editor-free social media sites, the veracity of news stories is, itself, a major news story.
For Facebook and Google, the rise of misinformation is a Frankenstein monster unleashed by their own technology. How can these companies rein in their dangerous creation? The briefest and most honest answer is that it won't be easy. But some key lessons about the intractability of bad information, and how to defeat it, dwell in the story of one of America's first major media hoaxes, nearly 200 years ago.
A new era of the news industry began in September 1833, when a 23-year-old named Benjamin Day founded the New York Sun. Newspapers of the time were an elite product, selling at the relatively high price of 6 cents. Day's innovation was to slash the price to a penny, attracting a much larger working-class audience, and then (this was the genius part) to sell those readers to advertisers. In other words, the readers—long before the era of cookies and tracking and data collection—became the product. The penny press, like the social network, was a new information technology that relied almost entirely on advertising, and it revolutionized the news business.
In August 1835, the Sun ran a news story claiming that a famous scientist, Sir John Herschel, had glimpsed, through a telescope, men gallivanting on the surface of the moon. Covered in reddish hair, and outfitted with large wings, they soared over blue bodies of water, cut by dramatic canyons. This was not satire, nor was it obvious fiction. The story was designed to look like a reprint of a paper that had been published in the Edinburgh Journal of Science. It was a lie, wearing the costume of a prestigious scientific journal. As Tim Wu wrote in his book The Attention Merchants, “Benjamin Day had invented ‘fake news’ and demonstrated its appeal.” Eighteen decades later, the prevalence of online trolling, abuse, violence, and other attention-seeking behavior shows that the country is still grappling with the inherent risks of ad-supported platforms for attention.
The entire moon hoax was six installments and 17,000 words long. Hardly literature, the piece was crafty about manipulating its readership. The first installment, mentioning no bat boys or lunar canyons, simply established the premise that a scientist named Herschel, working from the Cape of Good Hope, had built a telescope powerful enough to get a rodent’s-eye view of the moon’s surface. In the second installment, the author revealed the discovery of silvery-blue bison. The third installment introduced beavers living in multistory huts. The fourth revealed humanoid creatures with the face of an orangutan and the membranous wings of a bat, who reportedly partook in a great deal of open-air sex. In the final installments, readers learned of a higher order of bat-people who lived near a sapphire temple.
The initial response was pure astonishment, only slightly mitigated by skepticism. Belief was nearly absolute throughout the country, even among faculty at prestigious universities like Yale. The city's most famous skeptics were, perhaps, the staff of the rival New York Herald, who did not appreciate the Sun's success. Word of bat-people copulating in the lunar jungles of our satellite spread throughout the world—one of the first "viral" news stories. Its speedy propagation was aided by a new broadcast technology, the steam-powered printing press, whose output could spread any information widely and cheaply, no matter its veracity.
The hoax was only exposed when several rival papers and journals, led by the Herald, presented overwhelming evidence of its impossibility. There were a few dead giveaways, for example the fact that the Edinburgh Journal of Science no longer existed. A month later, the Sun admitted that the whole thing was made up.
Despite the hoax, Benjamin Day’s newspaper was an important journalistic innovation, bringing news to the working class. Advertising is intrinsically democratizing, since it subsidizes readers who cannot afford the full cost of information gathering. But human attention is a fickle mercenary, one who doesn’t always gravitate to the finest cause. People’s automatic attention tends toward the outrageous, not the civically valuable, making popularity in attention markets a poor gauge of truth.
According to Tim Wu, a taste for outrageousness, and a casual attitude toward the truth, is a natural tendency of any advertising platform. Even when the Sun's news items were more terrestrially conservative, the penny press still teemed with medical scams promising full heads of hair and cures for impotence. "By demonstrating that a business could be founded on the resale of human attention, Day and his competitors became the first attention merchants," Wu wrote. "[They] also, within just five years, discovered the public's weakness for death and violence, incessant trolling, and, finally, fake news."
Facebook and Google in the 21st century are novel platforms for human attention, much like the Sun and steam-powered penny press of the 1830s, but without dedicated editors and writers. What lesson should today’s attention merchants draw from America’s first news hoax?
First, intelligence is no match for gullibility. After all, university scientists believed the Sun's reports of lunar bat-people. Tens of millions of Facebook users saw Russian propaganda in their feeds. Facebook's current solution to the flood of fake news is to fight disinformation with context. For example, it is experimenting with tagging dubious articles to alert readers.
But this might not be enough. The sheer presence of fake news in the information ecosystem will inevitably make some people believe it, no matter what the label is. That was the conclusion of a new paper by Yale University researchers Gordon Pennycook, Tyrone D. Cannon, and David G. Rand. They showed participants false headlines as they might appear on Facebook. Some of the articles were labeled as disputed by fact checkers. But it hardly made a difference. "Tagging such stories as disputed is not an effective solution to this problem," they wrote, because merely seeing a news story increases its perceived accuracy. According to the mere-exposure effect, one of the oldest and most robust findings in the history of psychology, people prefer familiar shapes and ideas, particularly when they aren't exactly sure why those stimuli are familiar. (Forced overexposure—say, watching 40 Pepsi ads in a row—attenuates the effect.) In short, even dubious claims become more believable with repeated exposure. This is a critical problem for a social media site like Facebook, whose algorithm rewards clicks and is less adept at fact-checking.
Second, marketplaces of attention gravitate toward fiction when there are no safeguards. Fictional movies outsell documentaries. Novels outsell nonfiction titles. Facts can be a straitjacket for storytellers. So it’s hardly surprising that, while editors and reporters are concerned with facts, content platforms like Facebook and Google—who don’t pay reporters and replace editors with algorithms—will struggle to ensure that their users are always wearing their straitjacket. Mark Zuckerberg has claimed that connecting the world serves a dual purpose of unifying humanity and being a good business. But the recent experience of Facebook, juxtaposed with the story of Day’s moon hoax, suggests that wherever people create new platforms for human attention, some people will see that platform as an opportunity to spread propaganda, sensational lies, and fake news.
Third, the solution to fake news isn't better technology; it's better people. The antidote to the New York Sun's baser tendencies didn't come from the Sun, or the tabloids it inspired decades later. The "fix" to fake news, as it were, came from The New York Times, The Wall Street Journal, and other newspapers and magazines founded by editors who were more devoted to the truth. They combined Day's dual-revenue model with a reverence for facts. In the last 150 years, many news organizations—including magazines like this one—have shown that it is possible to accept advertising without descending into a nihilistic war for attention at all costs.
For Tim Wu, the most important takeaway from America's first big news hoax is simple, yet hard. "The solution is ethics," Wu told me. "They play an invisible role in keeping our world sane. But Facebook has shown a remarkable insensitivity to this point. I think that itself is unethical." Facebook and Google have built something extraordinary, yet inscrutable—a virtual infrastructure for connecting the media-making world. But in any marketplace of attention, there will be media makers who seek attention purely for its own sake, and for whom the responsibility to be truthful is, at best, secondary, if it exists at all. Facebook and Google have not yet shown that they can build lasting safeguards against this tendency.
But that doesn't make the task impossible. The platforms could hire fact-checkers—thousands of them. Rather than amplify any content that happens to be going viral, they could amplify trusted news sources and throttle stories that fail to meet a certain standard of veracity. There is no question that these changes would raise hell among some publishers, just as surely as others would welcome them. Like a newspaper, an algorithm is only as good as the people who write it.