Before you watch the Super Bowl tonight, you could, should you be so inclined, head over to YouTube and watch a preview of an ad Kia will be airing during the game. The spot features the Victoria's Secret model Adriana Lima wearing very little and doing even less: She spends the entirety of the ad, hilariously and (one presumes) at least partially satirically, swaying, saying nothing, and waving a checkered racing flag. Very, very slowly.
Super Bowl commercials (the experience of, the economics of, etc.) used to be pretty straightforward: Advertisers would gladly pay tons of money for a slot during the game's broadcast because an ad aired during the game was an amazingly efficient way of getting a message out to tons of people. That's still the case -- a 30-second spot is going, this year, for $3.5 million, up from $3 million last year -- but the mechanics of the messaging are changing, and rapidly. Super Bowl ads are no longer simply ads, in the Traditional Teevee sense; they're campaigns that play out, strategically, over time. Instead of functioning as commercial broadcasts unto themselves, they're acting more and more like episodic touchpoints for an expansive cultural conversation.
In part, that's about marketers racing each other for relevance in an environment where marketing messages no longer need to be confined to TV. But it's a bigger story, too -- of communications, overall, breaking free of the boxes that used to contain them. One function of the media, traditionally, has been the regulation not just of information, and not just of entertainment, but of time itself. Our broadcast networks, in particular, have segmented time into neat little boxes -- 30 seconds here, 30 minutes there -- and populated them with sounds and images that entertain and (occasionally) edify us. They have plotted our days into grids, scheduling our experience and helping us to forget that, in fact, there's very little that's natural about a time slot.
Super Bowl ads have been pretty much the Platonic culmination of the gridded media system. They have operated on the assumption that the Big Event itself is significant not just because of its content, but because of the community it convenes (111 million people!). The Super Bowl is time rendered collective and contained -- so of course marketers want to buy themselves a chunk of it. When better to make your pitch to the world than during the period when the maximum number of eyes are focused on, effectively, the same screen?
YouTube, and social networks in general, encourage precisely the opposite marketing model. Rather than containing consumer attention, they disperse it. They take the typical 30-second ad spot and condense it to five seconds ... or expand it to five hours. Or both. Or neither. It doesn't matter, because digital spaces remove time as both a constraint and a value in commercial production, allowing for marketing that insinuates itself into its intended audiences much more slowly, and much more manipulatively, and potentially much more effectively, than its analog counterparts.
You'd think all that would be bad news for broadcast networks, with marketers trading the boob tube for YouTube and abandoning the pricey Super Bowl altogether. Why buy the cow, and all that. But: Not only are marketers continuing to pay for something they could ostensibly get for free; they're paying more for it than they ever have before. They're still finding value -- millions of dollars' worth of it -- in the connective consciousness that the Super Bowl represents.
And that's because, in a world of atomized attention, anything that can aggregate us is becoming more valuable than it's ever been before. Ads aired during the Super Bowl aren't just ads; they're Super Bowl ads. That branding will give them a spot -- and a continued life -- in Monday's write-ups of Sunday's best Super Bowl spots, and in all those "Super Bowl Ads: 2012" collections that will function as archives for future generations. Their context will make them more than what they are. And that will make them, implicitly, more engaging than they might be otherwise. Super Bowl ads, as my colleague Jordan Weissmann has pointed out, have been found to be 58 percent more memorable than regular ads. And while that's partly, sure, because those ads generally represent the best stuff that J. Walter and friends have to offer, it's also because the ads, aired when they are, adopt the warmth of assumed connection that convened attention can confer. I am watching Matthew Broderick as 110,999,999 other people do. There is something epic -- and rare -- about that.
So Super Bowl ads are increasingly valuable because the kind of mass-conscious event they're part of is increasingly rare. Mass-ness itself is increasingly rare. Overall, in the U.S., TV viewership is declining. Audiences are fragmenting. The Gladwellian connectors that used to bring us together -- Lucy, J.R., Oscar -- are departing, leaving individual impulse as the driver of our time. This is wonderful, and liberating, but introduces its own set of quandaries. TV Guide, after all, wasn't just a guide book; it was a framework. It was a power structure. It assembled us, effortlessly, within its neat little boxes. By limiting our experience, it also connected our experience.
No longer. Increasingly, we're looking to social networks rather than TV networks for our entertainment, for our information, for our sense of the world. And those social networks are fluid and box-less and limitless in a way that traditional media never could be. What happens to events themselves -- those shared moments of cultural connection -- in a world where time is unconstrained? Is a Super Bowl ad really a Super Bowl ad when I can watch it long before kickoff?
A rock structure, built deep underground, is one of the earliest hominin constructions ever found.
In February 1990, thanks to a 15-year-old boy named Bruno Kowalsczewski, footsteps echoed through the chambers of Bruniquel Cave for the first time in tens of thousands of years.
The cave sits in France’s scenic Aveyron Valley, but its entrance had long been sealed by an ancient rockslide. Kowalsczewski’s father had detected faint wisps of air emerging from the scree, and the boy spent three years clearing away the rubble. He eventually dug out a tight, thirty-meter-long passage that the thinnest members of the local caving club could squeeze through. They found themselves in a large, roomy corridor. There were animal bones and signs of bear activity, but nothing recent. The floor was pockmarked with pools of water. The walls were punctuated by stalactites (the ones that hang down) and stalagmites (the ones that stick up).
Speculation about how Ramsay Bolton might die reveals the challenges of devising a cathartic TV death—and illuminates a larger issue facing the series.
Warning: Season 6 spoilers abound.
Ever since Ramsay Bolton revealed himself as Westeros’s villain-in-chief, Game of Thrones fans have wanted him dead. He first appeared in season three disguised as a Northern ally sent to help Theon Greyjoy but quickly turned out to be a lunatic whose appetite for cruelty only grew as the series progressed. (Last year, Atlantic readers voted him the actual worst character on television.) After several colorful and nauseating years of rape, torture, murder, and bad visual puns, speculation about the Bolton bastard’s looming death has reached its peak this sixth season. But “Will Ramsay die this season?” also gives way to a slightly more complicated question: “How should Ramsay die?”
What’s harder to believe: that it took a year for Andrea Constand to accuse the star of sexual assault, or that it’s taken 11 years and dozens more women coming forward for those accusations to be heard in court?
To date, more than 50 women have accused Bill Cosby of sexual misconduct. Constand was the first. In January of 2005 she told police that a year earlier, Cosby had touched and penetrated her after drugging her. A prosecutor decided against proceeding with the case, and Constand followed up with a civil suit that resulted in a 2006 settlement. After that came an accelerating drip of women making allegations about incidents spanning a wide swath of Cosby’s career, from Kristina Ruehli (1965) to Chloe Goins (2008).
Narcissism, disagreeableness, grandiosity—a psychologist investigates how Trump’s extraordinary personality might shape his possible presidency.
In 2006, Donald Trump made plans to purchase the Menie Estate, near Aberdeen, Scotland, aiming to convert the dunes and grassland into a luxury golf resort. He and the estate’s owner, Tom Griffin, sat down to discuss the transaction at the Cock & Bull restaurant. Griffin recalls that Trump was a hard-nosed negotiator, reluctant to give in on even the tiniest details. But, as Michael D’Antonio writes in his recent biography of Trump, Never Enough, Griffin’s most vivid recollection of the evening pertains to the theatrics. It was as if the golden-haired guest sitting across the table were an actor playing a part on the London stage.
“It was Donald Trump playing Donald Trump,” Griffin observed. There was something unreal about it.
Washington voters handed Hillary Clinton a primary win, symbolically reversing the result of the state caucus where Bernie Sanders prevailed.
Washington voters delivered a bit of bad news for Bernie Sanders’s political revolution on Tuesday. Hillary Clinton won the state’s Democratic primary, symbolically reversing the outcome of the state’s Democratic caucus in March, where Sanders prevailed. The primary result won’t count for much, since delegates have already been awarded based on the caucus. (Sanders won 74 delegates, while Clinton won only 27.) But Clinton’s victory nevertheless puts Sanders in an awkward position.
Sanders has styled himself as a populist candidate intent on giving a voice to voters in a political system in which, as he describes it, party elites and wealthy special-interest groups exert too much control. As the primary election nears its end, Sanders has railed against Democratic leaders for unfairly intervening in the process, a claim he made in the aftermath of the contentious Nevada Democratic convention earlier this month. He has also criticized superdelegates—elected officials and party leaders who can support whichever candidate they choose—for effectively crowning Clinton.
For toymakers like Lego, where is the line between making products children love and telling kids how they should play?
Two years ago, a 7-year-old girl named Charlotte wrote a letter to the toymaker Lego with a straightforward request.
“I love Legos,” she wrote, “but I don’t like that there are more lego boy people and barely any lego girls.” The girls in the Lego universe, Charlotte had noticed, seemed preoccupied with sitting at home, going to the beach, and shopping—while the boys had jobs, saved people, and went on adventures.
Charlotte, Lego acknowledged, had a point. “It’s fair,” said Michael McNally, a Lego spokesman who says the company receives letters from kids all the time. “Why wouldn’t there be more female representation?”
Years before Charlotte sent her letter, Lego was already keenly focused on how girls perceived the brand. It was 2008 when the toymaker decided to gather global data about who buys Legos. What they found was startling. In the United States, roughly 90 percent of Lego sets being sold were intended for boys. In other words, there was a huge untapped market of girls who weren’t building with Legos.
In recent years, the idea that educators should be teaching kids qualities like grit and self-control has caught on. Successful strategies, though, are hard to come by.
In 2013, for the first time, a majority of public-school students in this country—51 percent, to be precise—fell below the federal government’s low-income cutoff, meaning they were eligible for a free or subsidized school lunch. It was a powerful symbolic moment—an inescapable reminder that the challenge of teaching low-income children has become the central issue in American education.
The truth, as many American teachers know firsthand, is that low-income children can be harder to educate than children from more-comfortable backgrounds. Educators often struggle to motivate them, to calm them down, to connect with them. This doesn’t mean they’re impossible to teach, of course; plenty of kids who grow up in poverty are thriving in the classroom. But two decades of national attention have done little or nothing to close the achievement gap between poor students and their better-off peers.
Whatever banking’s post-recession connotations may be, the historian William Goetzmann argues that monetary innovations have always played a critical role in developing civilization.
The title of the financial historian William Goetzmann’s new book is hard to argue with: Money Changes Everything.
In his book, Goetzmann, a professor of finance and the director of the International Center for Finance at the Yale School of Management, has documented how financial innovations—from the invention of money to capital markets—have always played a critical role in developing every culture around the world. In the fallout from the Great Recession, it’s been commonplace to vilify those working in the financial-services industry. But Goetzmann argues that finance is a worthwhile endeavor, beyond just earning a ton of money: Its innovations have made the growth of human civilization possible.
For centuries, philosophers and theologians have almost unanimously held that civilization as we know it depends on a widespread belief in free will—and that losing this belief could be calamitous. Our codes of ethics, for example, assume that we can freely choose between right and wrong. In the Christian tradition, this is known as “moral liberty”—the capacity to discern and pursue the good, instead of merely being compelled by appetites and desires. The great Enlightenment philosopher Immanuel Kant reaffirmed this link between freedom and goodness. If we are not free to choose, he argued, then it would make no sense to say we ought to choose the path of righteousness.
Today, the assumption of free will runs through every aspect of American politics, from welfare provision to criminal law. It permeates the popular culture and underpins the American dream—the belief that anyone can make something of themselves no matter what their start in life. As Barack Obama wrote in The Audacity of Hope, American “values are rooted in a basic optimism about life and a faith in free will.”
Now that the entertainer seems to have wrapped up the Republican nomination, whom will he choose as his running mate?
For decades, a few antiquated bon mots about the vice presidency have held sway in discussions about running mates. For example, there’s Teddy Roosevelt’s declaration, “I would a great deal rather be anything, say professor of history, than vice president.” Even better was John Nance Garner’s verdict that the office he held under FDR was “not worth a bucket of warm piss.” Those quips hardly apply anymore; they’re as archaic as their authors. These days the Naval Observatory is a nice place to land. You could end up amassing unprecedented power and a man-sized safe, like Dick Cheney. You could end up with impressive power and become an aviator-clad folk hero, like Joe Biden.
Or maybe not. Will anyone want to be the running mate to presumptive Republican nominee Donald Trump? There are the character risks of cozying up to a man who’s liable to make a racist comment or accuse a rival’s father of being involved in the Kennedy assassination. There are the career risks of becoming associated with a man whom much of the Republican Party still doesn’t like. And there are the organizational risks of signing on as No. 2 to a man who’s famously a go-it-alone maverick.