At 6:25 a.m. Pacific Time on Monday, a 4.4-magnitude earthquake hit near Los Angeles. By 6:33, the Los Angeles Times became one of the first outlets to report the quake online. But it turns out the paper had a little help.
Database producer Ken Schwencke got the byline, but an interesting footnote told a different story:
This information comes from the USGS Earthquake Notification Service and this post was created by an algorithm written by the author.
Yes, a programmed bot helped prepare Schwencke's story for him. The credit goes to Quakebot, an algorithm created by Schwencke that automatically takes any earthquake over magnitude 3.0 reported by the U.S. Geological Survey, turns the information into a text story, adds a map, and sets a headline in the Times content management system.
This morning, two minutes after the earthquake struck, the USGS Earthquake Notification Service sent out the details of the quake's location and strength to anyone listening. At that same moment, Schwencke got an email: a story on the earthquake was ready to be published. Though Schwencke gets the official byline, Quakebot does the dirty work. Indeed, the biggest delay in the story going live was the time it took Schwencke to roll out of bed, turn on his computer, double-check Quakebot's accuracy, and press publish.
The Wire spoke with Schwencke to learn more about his earthquake reporting tool, which he said he built after Japan's massive, tsunami-causing earthquake in 2011. "It's been a great thing. It really alleviates some of the grunt work of reporting on this stuff," Schwencke told The Wire. Live earthquake data (size, location, time) from around the world is available in near real time on the USGS website. As soon as the agency reports an earthquake measuring magnitude 3.0 or more, Quakebot alerts Schwencke, other editors, and copy editors that a story has been written and is sitting in the CMS, ready for publishing. All they have to do is proofread, fact-check, and set it live.
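Quakebot's actual source code isn't published with this story, but the pipeline Schwencke describes — poll the USGS feed, keep only quakes above the magnitude threshold — can be sketched roughly like this. The feed URL is the USGS's public GeoJSON endpoint; the function names and the shape of the returned records are illustrative, not Quakebot's own.

```python
import json
from urllib.request import urlopen

# Public USGS GeoJSON feed of all earthquakes detected in the past hour.
USGS_FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_hour.geojson"

def fetch_feed(url=USGS_FEED):
    """Download and parse the USGS feed (requires network access)."""
    with urlopen(url) as resp:
        return json.load(resp)

def significant_quakes(feed, threshold=3.0):
    """Filter a parsed USGS GeoJSON feed down to quakes at or above `threshold`."""
    quakes = []
    for feature in feed.get("features", []):
        props = feature["properties"]
        mag = props.get("mag")
        if mag is not None and mag >= threshold:
            # GeoJSON coordinates are [longitude, latitude, depth in km].
            lon, lat, depth_km = feature["geometry"]["coordinates"]
            quakes.append({
                "magnitude": mag,
                "place": props.get("place"),
                "time_ms": props.get("time"),  # Unix epoch, milliseconds
                "lat": lat,
                "lon": lon,
            })
    return quakes
```

A bot like the one described would run this on a loop (or subscribe to the notification service) and, whenever `significant_quakes` returns anything, draft a story and email the editors.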
Though the majority of Quakebot's posts go unnoticed nationally, when an earthquake of this magnitude hits a major city like Los Angeles, the bot has a clear benefit in informing the audience. "It's been a very helpful tool I think to get up something as soon as possible and let people focus on finding out what's actually happening outside of the basic information," Schwencke said. The bot is helpful, too, in getting some traffic. "When you have a situation like that and you can in five minutes have something up there for people to Google and find ... I haven't checked but I imagine that was a very, very, very popular post on the website today," Schwencke said.
Similar projects exist elsewhere online, including the Twitter account LA QuakeBot, the creation of programmer Bill Snitzer.
These posts have a simple premise: take the small, factual pieces of data that make up the meat of any story, and automatically format them into a text-driven narrative. It's nothing that a regular reporter wouldn't look up in the course of reporting a story. "You can write code that will ask and answer the common questions that a reporter would ask when looking at that same dataset," LA Times database producer Ben Welsh told Journalism.co last year.
It's just one of several bots the LA Times uses to produce stories. The site also automates the opening sentence of its Homicide Report, which runs a story for every homicide in the Los Angeles area, as Journalism.co detailed. Another bot sends a daily email of the LAPD's arrests, alerting journalists to high-profile ones, such as arrests with particularly high bail or of suspects with newsworthy occupations.
Other automated stories would be just as simple to produce, especially for data-heavy news like the monthly employment numbers, sports results, or company IPO filings. The idea of using computer programs to craft stories is not new; all you need is a set of facts and a few rules about sentence structure. But few large news organizations have actually adopted them, possibly out of fear of angering their human employees. Indeed, a recent study found that most readers can't tell the difference between a human-written story and a computer-written one for these mundane types of stories.
This article is from the archive of our partner The Wire.