The Day Yahoo Decided I Liked Reading About Child Murder


Algorithms are shaping how we see the world around us, with big consequences. What a machine thinks we need to know can become what we fear.

[Image: Yahoo News/Rebecca J. Rosen]

On February 8, 2012, I was on Yahoo's homepage when a headline caught my eye: "Mo. teen gets life with possible parole in killing." Curious, I clicked to see what atrocity had transpired in the state where I live. Alyssa Bustamante, a teenager from Jefferson City, had strangled and stabbed her nine-year-old neighbor for the sheer thrill of it, later describing the event in her diary as an "ahmazing" experience. Horrified, I closed the page. For me, as for many whose homepage defaults to Yahoo, this quick scan of a story was a rote action, information via procrastination, performed almost subconsciously every morning before moving on to other things. In this case, the story was so awful that I wanted to get away. Except, it turned out, I couldn't.

For the next month, I woke up to a barrage of horrifying stories that seemed to signal an epidemic of child torture in America. "3-year-old recovering after swallowing 37 powerful magnets," Yahoo solemnly informed me on March 5; "Police: Alaska girl locked in frigid bedroom dies" on March 6. Occasionally the child in question survived their ordeal ("7-year-old boy survives brush with tornado in North Carolina," March 4), but more often than not the children were cast as both victims and perpetrators ("Boy, 9, charged in shooting of third-grade classmate," February 23; "11-year-old California girl dies after fight with classmate," February 26; "Texas boy, 12, accused of brandishing loaded gun," February 27; "10-year-old girl's death in fight with student ruled homicide," February 27).

I rarely clicked on any of these headlines, and I didn't notice the way they had crept into my Yahoo homepage -- and into my mind -- until their pervasiveness became impossible to ignore.

That's when I realized: Yahoo had decided I liked child murder.

* * *

"If there is one unambiguous trend in how the Internet is developing today," writes Evgeny Morozov, "it's the drive toward the personalization of our online experience. Everything we click, read, search, and watch online is increasingly the result of some delicate optimization effort, whereby our previous clicks, searches, 'likes,' purchases, and interactions determine what appears in our browsers and apps."

Morozov was writing about algorithmic optimization, a concept outlined in Eli Pariser's "The Filter Bubble," which describes the way that websites like Yahoo and Google tailor what they show a user according to that user's previous online activity. By capitalizing on what are assumed to be your pre-existing interests, personalization aims to make you more likely to read stories or click on ads.

Opponents of the practice, like Pariser, fear that filter bubbles prevent users from experiencing viewpoints other than their own. They strip online worlds of their serendipity, imprisoning users in an informational comfort zone. But I had the opposite experience: child murder was my presumed interest. Yahoo News had become my own personal Hunger Games, making me a spectator to violence I would never voluntarily seek out.

Filter bubbles are usually criticized on material or political grounds: They reinforce pre-existing tastes, manipulate consumers into buying products, and limit knowledge of opposing views. But what if the filter is wrong? What if it's not a true reflection, but a false mirror -- one that does not respond to fears and prejudices, but creates them?

* * *

We live in one of the safest eras in recent history to raise kids. Violence against children has dropped over the past four decades, yet the perception that times are more dangerous has made American parents more over-protective than ever before. David Robert Hogg, who runs the parenting website My Little Nomads, attributes this to the "TV bubble -- where the world is filled with risks and crime and violence." Hogg supports the philosophy, popularized by journalist Lenore Skenazy, of "free-range kids": a movement that seeks to debunk media myths of rampant child endangerment and encourage a more laissez-faire approach to parenting.

Is the filter bubble replacing the TV bubble? It's possible, but, crucially, it's impossible to tell. Unlike tabloid television, algorithmic personalization does not announce that it's pandering to base interests. When sensationalized reports about violence against children are on TV, I can change the channel -- an act that is harder to do on the Internet when seemingly "neutral" spaces, like Yahoo's homepage, leave no tell-tale trace of manipulation. You can't change the channel when you don't know you're watching the program.

Yahoo personalizes headlines for its audience of over 700 million people through its Content Optimization and Relevance Engine, an algorithmic system based on demographic data and reading behavior. As a researcher who studies digital media, I was aware that my news was filtered, but I had never noticed the filtering process in action, probably because, until now, Yahoo had guessed right about me. (Or at least not so gruesomely wrong.) When the headlines first appeared, I thought they were an anomaly, but as the weeks went on, I noticed a pattern. Going through my search history, I could trace the emergence of stories to the day I read about the Bustamante murder.
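To see how a single click can tilt an entire feed, consider a deliberately simplified sketch of interest-based ranking. This is a hypothetical illustration of the general technique -- not Yahoo's actual CORE system, whose internals are not public -- and every class name, topic label, and weight in it is invented for the example.

```python
from collections import defaultdict

# Hypothetical sketch of click-driven personalization.
# Not Yahoo's CORE; topics, weights, and decay are invented.

class ToyPersonalizer:
    def __init__(self, decay=0.9):
        self.weights = defaultdict(float)  # topic -> inferred interest score
        self.decay = decay

    def record_click(self, topics):
        # Fade old interests slightly, then reinforce the clicked story's topics.
        for topic in self.weights:
            self.weights[topic] *= self.decay
        for topic in topics:
            self.weights[topic] += 1.0

    def rank(self, candidates):
        # Order headlines by the summed weights of their topics (ties keep input order).
        return sorted(
            candidates,
            key=lambda item: sum(self.weights[t] for t in item[1]),
            reverse=True,
        )

feed = ToyPersonalizer()
feed.record_click({"crime", "children"})  # one click on the Bustamante story

headlines = [
    ("Markets rally on strong jobs report", {"business"}),
    ("Police: Alaska girl locked in frigid bedroom dies", {"crime", "children"}),
    ("Ten things to think about when choosing a hotel", {"travel"}),
]
for headline, _topics in feed.rank(headlines):
    print(headline)
# A single click is enough to push the grisly story to the top,
# and each further click on it reinforces the loop.
```

In a sketch like this, the feedback loop is the point: the ranker never asks whether a click signaled interest or horror, so one moment of morbid curiosity compounds into a standing editorial judgment.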

What I could not determine was whether I was alone in receiving such lurid headlines. When television news airs stories on murdered children, people can complain to each other about the sordid exploitation, the manipulation of tragedy for ratings gain. A mass media event can be dissected, and protested, by the masses.

In my case, the focus could only turn inward: Why was Yahoo feeding me such grisly material? Was it really because I had clicked on that one story? Was it because I am a mom in Missouri, and this is what they think moms in Missouri like to read? Yahoo's headlines had become a character assessment -- not one to be taken seriously, but one that turned news consumption into self-analysis, or at least an analysis of my algorithmic analogue: What did the headlines say about me?

* * *

In 2010, noted technophobe Jonathan Franzen revealed his fondness for AOL: "AOL's little box -- the welcome screen, they call it, I guess -- is so infuriating in its dopiness: 'Surprising Leader In The Masters! Find Out Who!' 'Ten Things To Think About When Choosing A Hotel!' 'What Smart Travelers Know About X!' It's all in compact form, and it kind of tells me everything I need to know about the larger stupidity. It helps keep me in touch."

Condescension aside, Franzen has a point. Portals like AOL or Yahoo, with their mix of gossip and politics and recipes and celebrity death watch masquerading as "trending now" (Larry Hagman? Ernest Borgnine?), feel like a throwback to a time of inclusive, if dim-witted, media. I am not alone in my taste for the larger stupidity: Yahoo is the most popular online news source in the world. Unlike those of a website whose sole purpose is news, Yahoo's headlines seem like too much of an afterthought to be pointed. Unlike Facebook or Google, with their mercurial platforms and pretense to philosophy, Yahoo seems too uncool to control you.

Yet that might be the reason it is effective in doing so. Few perusing Yahoo headlines would suspect that children murdering children is a reader category chosen by robots. While the practice is disturbing on an epistemological level, it may also have practical consequences. As we rely on internet media to give us a taste of what's going on, we don't realize we're consuming a particular flavor. A sudden uptick in stories on violence -- particularly by or against a specific demographic category -- can spur paranoia, prejudice, and vigilante behavior. What a machine thinks we need to know can become what we fear. But because the algorithmic process is both secret and subjective, we have no way of tracking the ramifications.

Media organizations have long been accused of bias. Social media shifted that bias from the organization to the user -- the filter bubble of news chosen by friends, the friends themselves filtered by assumed similarities. But now we must contend with the bias of a false version of ourselves. Yahoo's murder feed exposes the algorithmic process for what it is: personalization without the person.

Sarah Kendzior is an anthropologist who studies politics and the internet in Central Asia.
