Before you watch the Super Bowl tonight, you could, should you be so inclined, head over to YouTube and watch a preview of an ad Kia will be airing during the game. The spot features the Victoria's Secret model Adriana Lima wearing very little and doing even less: She spends the entirety of the ad, hilariously and (one presumes) at least partially satirically, swaying, saying nothing, and waving a checkered racing flag. Very, very slowly.
Super Bowl commercials (the experience of, the economics of, etc.) used to be pretty straightforward: Advertisers would gladly pay tons of money for a slot during the game's broadcast because an ad aired during the game's broadcast was an amazingly efficient way of getting a message out to tons of people. That's still the case -- a 30-second space is going, this year, for $3.5 million, up from $3 million last year -- but the mechanics of the messaging are changing, and rapidly. Super Bowl ads are no longer simply ads, in the Traditional Teevee sense; they're campaigns that play out, strategically, over time. Instead of functioning as commercial broadcasts unto themselves, they're acting more and more like episodic touchpoints for an expansive cultural conversation.
In part, that's about marketers racing each other for relevance in an environment where marketing messages no longer need to be confined to TV. But it's a bigger story, too -- of communications, overall, breaking free of the boxes that used to contain them. One function of the media, traditionally, has been the regulation not just of information, and not just of entertainment, but of time itself. Our broadcast networks, in particular, have segmented time into neat little boxes -- 30 seconds here, 30 minutes there -- and populated them with sounds and images that entertain and (occasionally) edify us. They have plotted our days into grids, scheduling our experience and helping us to forget that, in fact, there's very little that's natural about a time slot.
Super Bowl ads have been pretty much the Platonic culmination of the gridded media system. They have operated on the assumption that a Big Event itself (the experience of, the economics of) is significant not just because of its content, but because of the community it convenes (111 million people!). The Super Bowl is time rendered collective and contained -- so of course marketers want to buy themselves a chunk of it. When better to make your pitch to the world than during the period when the maximum amount of eyes are focused on, effectively, the same screen?
YouTube, and social networks in general, encourage precisely the opposite marketing model. Rather than containing consumer attention, they disperse it. They take the typical 30-second ad spot and condense it to five seconds ... or expand it to five hours. Or both. Or neither. It doesn't matter, because digital spaces remove time as both a constraint and a value in commercial production, allowing for marketing that insinuates itself into its intended audiences much more slowly, and much more manipulatively, and potentially much more effectively, than its analog counterparts.
You'd think all that would be bad news for broadcast networks, with marketers trading YouTube for boob tube and abandoning the pricey Super Bowl altogether. Why buy the cow, and all that. But: Not only are marketers continuing to pay for something they could ostensibly get for free; they're paying more for it than they ever have before. They're still finding value -- millions of dollars' worth of it -- in the connective consciousness that the Super Bowl represents.
And that's because, in a world of atomized attention, anything that can aggregate us is becoming more valuable than it's ever been before. Ads aired during the Super Bowl aren't just ads; they're Super Bowl ads. That branding will give them a spot -- and a continued life -- in Monday's write-ups of Sunday's best Super Bowl spots, and in all those "Super Bowl Ads: 2012" collections that will function as archives for future generations. Their context will make them more than what they are. And that will make them, implicitly, more engaging than they might be otherwise. Super Bowl ads, as my colleague Jordan Weissmann has pointed out, have been found to be 58 percent more memorable than regular ads. And while that's partly, sure, because those ads generally represent the best stuff that J. Walter and friends have to offer, it's also because the ads, aired when they are, adopt the warmth of assumed connection that convened attention can confer. I am watching Matthew Broderick as 110,999,999 other people do. There is something epic -- and rare -- about that.
So Super Bowl ads are increasingly valuable because the kind of mass-conscious event they're part of is increasingly rare. Mass-ness itself is increasingly rare. Overall, in the U.S., TV viewership is declining. Audiences are fragmenting. The Gladwellian connectors that used to bring us together -- Lucy, J.R., Oscar -- are departing, leaving individual impulse as the driver of our time. This is wonderful, and liberating, but introduces its own set of quandaries. TV Guide, after all, wasn't just a guide book; it was a framework. It was a power structure. It assembled us, effortlessly, within its neat little boxes. By limiting our experience, it also connected our experience.
No longer. Increasingly, we're looking to social networks rather than TV networks for our entertainment, for our information, for our sense of the world. And those social networks are fluid and box-less and limitless in a way that traditional media never could be. What happens to events themselves -- those shared moments of cultural connection -- in a world where time is unconstrained? Is a Super Bowl ad really a Super Bowl ad when I can watch it long before kickoff?
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Without the financial support that many white families can provide, minority young people have to continually make sacrifices that set them back.
The year after my father died, I graduated from grad school, got a new job, and looked forward to saving for a down payment on my first home, a dream I had long held but always found out of reach. I pulled up a blank spreadsheet and made a line item called “House Fund.”
That same week I got a call from my mom—she was struggling to pay off my dad’s funeral expenses. I looked at my “House Fund” and sighed. Then I deleted it and typed the words “Funeral Fund” instead.
My father’s passing was unexpected. And so was the financial burden that came with it.
For many Millennials of color, these sorts of trade-offs aren’t an anomaly. During key times in their lives when they should be building assets, they’re spending money on basic necessities and often helping out family. Their financial future is a rocky one, and much of it comes down to how much—or how little—assistance they receive.
Maya Arulpragasam is a famous rapper, singer, designer, producer, and refugee. When she was 9, her mother and siblings fled violence in Sri Lanka and came to London, and the experience was formative for her art. As she explained to The Guardian in 2005 after the release of her debut Arular, “I was a refugee because of war and now I have a voice in a time when war is the most invested thing on the planet. What I thought I should do with this record is make every refugee kid that came over after me have something to feel good about. Take everybody’s bad bits and say, ‘Actually, they’re good bits. Now whatcha gonna do?’”
That goal—to glorify people and practices that the developed world marginalizes—has been a constant in her career. Her new music video tackles it in a particularly literal and urgent way, not only by showing solidarity with refugees at a moment when they’re extremely controversial in the West, but also by posing a simple question to listeners: Whose lives do you value?
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
Jeb Bush, John Kasich, and other presidential contenders appease Donald Trump at their own peril.
Give Donald Trump this: He has taught Americans something about the candidates he’s running against. He has exposed many of them as political cowards.
In August, after Trump called undocumented Mexican immigrants “rapists” and vowed to build a wall along America’s southern border, Jeb Bush traveled to South Texas to respond. Bush’s wife is Mexican American; he has said he’s “immersed in the immigrant experience”; he has even claimed to be Hispanic himself. Yet he didn’t call Trump’s proposals immoral or bigoted, since that might offend Trump’s nativist base. Instead, Bush declared: “Mr. Trump’s plans are not grounded in conservative principles. His proposal is unrealistic. It would cost hundreds of billions of dollars.” In other words, demonizing and rounding up undocumented Mexican immigrants is fine, so long as it’s done cheaply.
Why are so many kids with bright prospects killing themselves in Palo Alto?
The air shrieks, and life stops. First, from far away, comes a high whine like angry insects swarming, and then a trampling, like a herd moving through. The kids on their bikes who pass by the Caltrain crossing are eager to get home from school, but they know the drill. Brake. Wait for the train to pass. Five cars, double-decker, tearing past at 50 miles an hour. Too fast to see the faces of the Silicon Valley commuters on board, only a long silver thing with black teeth. A Caltrain coming into a station slows, invites you in. But a Caltrain at a crossing registers more like an ambulance, warning you fiercely out of its way.
The kids wait until the passing train forces a gust you can feel on your skin. The alarms ring and the red lights flash for a few seconds more, just in case. Then the gate lifts up, signaling that it’s safe to cross. All at once life revives: a rush of bikes, skateboards, helmets, backpacks, basketball shorts, boisterous conversation. “Ew, how old is that gum?” “The quiz is next week, dipshit.” On the road, a minivan makes a left a little too fast—nothing ominous, just a mom late for pickup. The air is again still, like it usually is in spring in Palo Alto. A woodpecker does its work nearby. A bee goes in search of jasmine, stinging no one.
To solve climate change, we need to reimagine our entire relationship to the nonhuman world.
Humans were once a fairly average species of large mammals, living off the land with little effect on it. But in recent millennia, our relationship with the natural world has changed as dramatically as our perception of it.
There are now more than 7 billion people on this planet, drinking its water, eating its plants and animals, and mining its raw materials to build and power our tools. These everyday activities might seem trivial from the perspective of any one individual, but aggregated together they promise to leave lasting imprints on the Earth. Human power is now geological in scope—and if we are to avoid making a mess of this, our only home, our politics must catch up.
Making this shift will require a radical change in how we think about our relationship to the natural world. That may sound like cause for despair. After all, many people refuse to admit that environmental crises like climate change exist at all. But as Jedediah Purdy reminds us in his dazzling new book, After Nature, our relationship with the nonhuman world has proved flexible over time. People have imagined nature in a great many ways across history.
The generation has been called lazy, entitled, and narcissistic. Their bosses beg to differ.
Yes, many Millennials are still crashing on their parents’ couches. And there’s data to support the claim that they generally want more perks but less face time, and that they hope to rise quickly but don’t stick around for very long. Millennials have also been pretty vocal about their desire to have more flexible jobs and more leave time.
But does all of this mean that all Millennials are actually worse workers?
Laura Olin, a digital campaigner who ran social-media strategy for President Obama’s 2012 campaign, says that’s not been her experience. “You always hear about Millennials supposedly being entitled and needing coddling, but the ones I’ve encountered have been incredibly hard-working and recognize that they need to pay their dues.”
Places like St. Louis and New York City were once similarly prosperous. Then, 30 years ago, the United States turned its back on the policies that had been encouraging parity.
Despite all the attention focused these days on the fortunes of the “1 percent,” debates over inequality still tend to ignore one of its most politically destabilizing and economically destructive forms. This is the growing, and historically unprecedented, economic divide that has emerged in recent decades among the different regions of the United States.
Until the early 1980s, a long-running feature of American history was the gradual convergence of income across regions. The trend goes back to at least the 1840s, but grew particularly strong during the middle decades of the 20th century. This was, in part, a result of the South catching up with the North in its economic development. As late as 1940, per-capita income in Mississippi, for example, was still less than one-quarter that of Connecticut. Over the next 40 years, Mississippians saw their incomes rise much faster than did residents of Connecticut, until by 1980 Mississippi’s per-capita income had climbed to 58 percent of Connecticut’s.
What I learned from attending a town-hall meeting and listening to students’ concerns
Sometimes it takes a group of young people to set you straight.
For months now, I’ve been reading about college students who’ve been seeking “safe spaces.” They’ve often been met with derision—even the highest-ranked Urban Dictionary definition is mired in sarcasm, describing them as having “pillows” and “soothing music” that “allows them to recover from the trauma... of exposure to ideas that conflict with their leftist professors.”
I also harbored some midlife skepticism about teenage hyperbole—until I attended a town-hall meeting at Duke University (my alma mater) earlier this month. The “community conversation,” as it was called, had been hastily convened to discuss the rash of racist and homophobic incidents on campus. Listening to those students—and watching their expressions—I realized that what’s been happening at Duke is serious, and no amount of sarcasm can disguise the pain and anger on campus, or cover up the real dangers lurking there.