Before you watch the Super Bowl tonight, you could, should you be so inclined, head over to YouTube and watch a preview of an ad Kia will be airing during the game. The spot features the Victoria's Secret model Adriana Lima wearing very little and doing even less: She spends the entirety of the ad, hilariously and (one presumes) at least partially satirically, swaying, saying nothing, and waving a checkered racing flag. Very, very slowly.
Super Bowl commercials (the experience of, the economics of, etc.) used to be pretty straightforward: Advertisers would gladly pay tons of money for a slot during the game's broadcast because an ad aired during the game's broadcast was an amazingly efficient way of getting a message out to tons of people. That's still the case -- a 30-second space is going, this year, for $3.5 million, up from $3 million last year -- but the mechanics of the messaging are changing, and rapidly. Super Bowl ads are no longer simply ads, in the Traditional Teevee sense; they're campaigns that play out, strategically, over time. Instead of functioning as commercial broadcasts unto themselves, they're acting more and more like episodic touchpoints for an expansive cultural conversation.
In part, that's about marketers racing each other for relevance in an environment where marketing messages no longer need to be confined to TV. But it's a bigger story, too -- of communications, overall, breaking free of the boxes that used to contain them. One function of the media, traditionally, has been the regulation not just of information, and not just of entertainment, but of time itself. Our broadcast networks, in particular, have segmented time into neat little boxes -- 30 seconds here, 30 minutes there -- and populated them with sounds and images that entertain and (occasionally) edify us. They have plotted our days into grids, scheduling our experience and helping us to forget that, in fact, there's very little that's natural about a time slot.
Super Bowl ads have been pretty much the Platonic culmination of the gridded media system. They have operated on the assumption that the Big Event itself is significant not just because of its content, but because of the community it convenes (111 million people!). The Super Bowl is time rendered collective and contained -- so of course marketers want to buy themselves a chunk of it. When better to make your pitch to the world than during the period when the maximum number of eyes are focused on, effectively, the same screen?
YouTube, and social networks in general, encourage precisely the opposite marketing model. Rather than containing consumer attention, they disperse it. They take the typical 30-second ad spot and condense it to five seconds ... or expand it to five hours. Or both. Or neither. It doesn't matter, because digital spaces remove time as both a constraint and a value in commercial production, allowing for marketing that insinuates itself into its intended audiences much more slowly, and much more manipulatively, and potentially much more effectively, than its analog counterparts.
You'd think all that would be bad news for broadcast networks, with marketers trading the boob tube for YouTube and abandoning the pricey Super Bowl altogether. Why buy the cow, and all that. But: Not only are marketers continuing to pay for something they could ostensibly get for free; they're paying more for it than they ever have before. They're still finding value -- millions of dollars' worth of it -- in the connective consciousness that the Super Bowl represents.
And that's because, in a world of atomized attention, anything that can aggregate us is becoming more valuable than it's ever been before. Ads aired during the Super Bowl aren't just ads; they're Super Bowl ads. That branding will give them a spot -- and a continued life -- in Monday's write-ups of Sunday's best Super Bowl spots, and in all those "Super Bowl Ads: 2012" collections that will function as archives for future generations. Their context will make them more than what they are. And that will make them, implicitly, more engaging than they might be otherwise. Super Bowl ads, as my colleague Jordan Weissmann has pointed out, have been found to be 58 percent more memorable than regular ads. And while that's partly, sure, because those ads generally represent the best stuff that J. Walter and friends have to offer, it's also because the ads, aired when they are, adopt the warmth of assumed connection that convened attention can confer. I am watching Matthew Broderick as 110,999,999 other people do. There is something epic -- and rare -- about that.
So Super Bowl ads are increasingly valuable because the kind of mass-conscious event they're part of is increasingly rare. Mass-ness itself is increasingly rare. Overall, in the U.S., TV viewership is declining. Audiences are fragmenting. The Gladwellian connectors that used to bring us together -- Lucy, J.R., Oscar -- are departing, leaving individual impulse as the driver of our time. This is wonderful, and liberating, but it introduces its own set of quandaries. TV Guide, after all, wasn't just a guidebook; it was a framework. It was a power structure. It assembled us, effortlessly, within its neat little boxes. By limiting our experience, it also connected our experience.
No longer. Increasingly, we're looking to social networks rather than TV networks for our entertainment, for our information, for our sense of the world. And those social networks are fluid and box-less and limitless in a way that traditional media never could be. What happens to events themselves -- those shared moments of cultural connection -- in a world where time is unconstrained? Is a Super Bowl ad really a Super Bowl ad when I can watch it long before kickoff?
Meet the Bernie Sanders supporters who say they won’t switch allegiances, no matter what happens in the general election.
Loyal fans of Bernie Sanders have a difficult decision to make. If Hillary Clinton faces off against Donald Trump in the 2016 presidential election, legions of Sanders supporters will have to decide whether to switch allegiances or stand by Bernie until the bitter end.
At least some supporters of the Vermont senator insist they won’t vote for Clinton, no matter what. Many view the former secretary of state, with her deep ties to the Democratic establishment, as the polar opposite of Sanders and his rallying cry of political revolution. Throwing their weight behind her White House bid would feel like a betrayal of everything they believe.
These voters express unwavering dedication to Sanders on social media, deploying hashtags like #NeverClinton and #NeverHillary, and circulating petitions like www.wontvotehillary.com, which asks visitors to promise “under no circumstances will I vote for Hillary Clinton.” It’s garnered more than 56,500 signatures so far. Many feel alienated by the Democratic Party. They may want unity, but not if it means a stamp of approval for a political status quo they believe is fundamentally flawed and needs to be fixed.
The candidate has exposed the tension between democracy and liberal values—just like the Arab Spring did.
When I was living in the Middle East, politics always felt existential, in a way that I suppose I could never fully understand. After all, I could always leave (as my relatives in Egypt were fond of reminding me). But it was easy enough to sense it. Here, in the era of Arab revolt, elections really had consequences. Politics wasn’t about policy; it was about a battle over the very meaning and purpose of the nation-state. These were the things that mattered more than anything else, in part because they were impossible to measure or quantify.
The primary divide in most Arab countries was between Islamists and non-Islamists. The latter, especially those of a more secular bent, feared that Islamist rule, however “democratic” it might be, would alter the nature of their countries beyond recognition. It wouldn’t just affect their governments or their laws, but how they lived, what they wore, and how they raised their sons and daughters.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
Boosting your ego won’t make you feel better. Instead, try talking to yourself like you would your best friend.
In 1986, California state assemblyman John Vasconcellos came up with what he believed could be “a vaccine for major social ills” like teen pregnancy and drug abuse: a special task force to promote self-esteem among Californians. The effort folded three years later, and was widely considered not to have accomplished much.
To Kristin Neff, a psychology professor at the University of Texas, that’s not surprising. Though self-esteem continues to reverberate as a pop-psych cure-all, the quest for inflated egos, in her view, is misguided and largely pointless.
There’s nothing wrong with being confident, to answer Demi Lovato’s question. The trouble is how we try to achieve high self-regard. Often, it’s by undermining others or comparing our achievements to those around us. That’s not just unsustainable, Neff argues; it can also lead to narcissism or depressive bouts during hard times.
There’s no escaping the pressure that U.S. inequality exerts on parents to make sure their kids succeed.
More than a half-century ago, Betty Friedan set out to call attention to “the problem that has no name,” by which she meant the dissatisfaction of millions of American housewives.
Today, many are suffering from another problem that has no name, and it’s manifested in the bleak financial situations of millions of middle-class—and even upper-middle-class—American households.
Poverty doesn’t describe the situation of middle-class Americans, who by definition earn decent incomes and live in relative material comfort. Yet they are in financial distress. Among people earning between $40,000 and $100,000 (i.e., not the very poorest), 44 percent said they could not come up with $400 in an emergency (either with cash or with a credit card whose bill they could pay off within a month). Even more astonishing, 27 percent of those making more than $100,000 also could not. This is not poverty. So what is it?
If pushed, most people would say, “It’s discriminatory.” That’s the answer my Con Law students often give about various hypothetical statutes. They’re always correct, and always wrong, because all laws are “discriminatory.” Driver’s-license laws and drinking laws discriminate on the basis of age, for example. Immigration law discriminates on the basis of birthplace and citizenship. Tax laws discriminate on the basis of residence, income level, home ownership, and occupation.
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre’s more glib tendencies.
The U.S. president talks through his hardest decisions about America’s role in the world.
Friday, August 30, 2013, the day the feckless Barack Obama brought to a premature end America’s reign as the world’s sole indispensable superpower—or, alternatively, the day the sagacious Barack Obama peered into the Middle Eastern abyss and stepped back from the consuming void—began with a thundering speech given on Obama’s behalf by his secretary of state, John Kerry, in Washington, D.C. The subject of Kerry’s uncharacteristically Churchillian remarks, delivered in the Treaty Room at the State Department, was the gassing of civilians by the president of Syria, Bashar al-Assad.
Stuffed to overflowing with superheroes, the studio’s latest nonetheless understands that character is key.
Way back in 2012, I was genuinely astonished by the cinematic juggling act that Joss Whedon accomplished in The Avengers. Six heroes pulled from widely different walks of super-life: Who could believe he’d manage to integrate them all into a coherent story?
These days, that challenge looks rudimentary. A year ago, Whedon’s The Avengers: Age of Ultron found space to squeeze in three more heroes and a brand-new super-villain, along with another half-dozen characters from the ever-expanding Marvel universe. And now, in Captain America: Civil War—which serves in many respects as a third Avengers movie—we have fully a dozen heroes divvied up into two competing super-teams. At this rate, pretty soon Marvel Studios honcho Kevin Feige will have to rent out a stadium just to accommodate his lycra-clad swarms.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.