Before you watch the Super Bowl tonight, you could, should you be so inclined, head over to YouTube and watch a preview of an ad Kia will be airing during the game. The spot features the Victoria's Secret model Adriana Lima wearing very little and doing even less: She spends the entirety of the ad, hilariously and (one presumes) at least partially satirically, swaying, saying nothing, and waving a checkered racing flag. Very, very slowly.
Super Bowl commercials (the experience of, the economics of, etc.) used to be pretty straightforward: Advertisers would gladly pay tons of money for a slot during the game's broadcast because an ad aired during the game's broadcast was an amazingly efficient way of getting a message out to tons of people. That's still the case -- a 30-second space is going, this year, for $3.5 million, up from $3 million last year -- but the mechanics of the messaging are changing, and rapidly. Super Bowl ads are no longer simply ads, in the Traditional Teevee sense; they're campaigns that play out, strategically, over time. Instead of functioning as commercial broadcasts unto themselves, they're acting more and more like episodic touchpoints for an expansive cultural conversation.
In part, that's about marketers racing each other for relevance in an environment where marketing messages no longer need to be confined to TV. But it's a bigger story, too -- of communications, overall, breaking free of the boxes that used to contain them. One function of the media, traditionally, has been the regulation not just of information, and not just of entertainment, but of time itself. Our broadcast networks, in particular, have segmented time into neat little boxes -- 30 seconds here, 30 minutes there -- and populated them with sounds and images that entertain and (occasionally) edify us. They have plotted our days into grids, scheduling our experience and helping us to forget that, in fact, there's very little that's natural about a time slot.
Super Bowl ads have been pretty much the Platonic culmination of the gridded media system. They have operated on the assumption that a Big Event itself (the experience of, the economics of) is significant not just because of its content, but because of the community it convenes (111 million people!). The Super Bowl is time rendered collective and contained -- so of course marketers want to buy themselves a chunk of it. When better to make your pitch to the world than during the period when the maximum number of eyes are focused on, effectively, the same screen?
YouTube, and social networks in general, encourage precisely the opposite marketing model. Rather than containing consumer attention, they disperse it. They take the typical 30-second ad spot and condense it to five seconds ... or expand it to five hours. Or both. Or neither. It doesn't matter, because digital spaces remove time as both a constraint and a value in commercial production, allowing for marketing that insinuates itself into the attention of its intended audiences much more slowly, and much more manipulatively, and potentially much more effectively, than its analog counterparts.
You'd think all that would be bad news for broadcast networks, with marketers trading the boob tube for YouTube and abandoning the pricey Super Bowl altogether. Why buy the cow, and all that. But: Not only are marketers continuing to pay for something they could ostensibly get for free; they're paying more for it than they ever have before. They're still finding value -- millions of dollars worth of it -- in the connective consciousness that the Super Bowl represents.
And that's because, in a world of atomized attention, anything that can aggregate us is becoming more valuable than it's ever been before. Ads aired during the Super Bowl aren't just ads; they're Super Bowl ads. That branding will give them a spot -- and a continued life -- in Monday's write-ups of Sunday's best Super Bowl spots, and in all those "Super Bowl Ads: 2012" collections that will function as archives for future generations. Their context will make them more than what they are. And that will make them, implicitly, more engaging than they might be otherwise. Super Bowl ads, as my colleague Jordan Weissmann has pointed out, have been found to be 58 percent more memorable than regular ads. And while that's partly, sure, because those ads generally represent the best stuff that J. Walter and friends have to offer, it's also because the ads, aired when they are, adopt the warmth of assumed connection that convened attention can confer. I am watching Matthew Broderick as 110,999,999 other people do. There is something epic -- and rare -- about that.
So Super Bowl ads are increasingly valuable because the kind of mass-conscious event they're part of is increasingly rare. Mass-ness itself is increasingly rare. Overall, in the U.S., TV viewership is declining. Audiences are fragmenting. The Gladwellian connectors that used to bring us together -- Lucy, J.R., Oscar -- are departing, leaving individual impulse as the driver of our time. This is wonderful, and liberating, but introduces its own set of quandaries. TV Guide, after all, wasn't just a guide book; it was a framework. It was a power structure. It assembled us, effortlessly, within its neat little boxes. By limiting our experience, it also connected our experience.
No longer. Increasingly, we're looking to social networks rather than TV networks for our entertainment, for our information, for our sense of the world. And those social networks are fluid and box-less and limitless in a way that traditional media never could be. What happens to events themselves -- those shared moments of cultural connection -- in a world where time is unconstrained? Is a Super Bowl ad really a Super Bowl ad when I can watch it long before kickoff?
The First Lady took to the stage at the Democratic National Convention, and united a divided hall.
Most convention speeches are forgotten almost before they’re finished. But tonight in Philadelphia, Michelle Obama delivered a speech that will be replayed, quoted, and anthologized for years. It was as pure a piece of political oratory as this campaign has offered, and instantly entered the pantheon of great convention speeches.
Obama stepped out onto a stage in front of a divided party, including delegates who had booed almost every mention of the presumptive nominee. And she delivered a speech that united the hall, bringing it to its feet.
She did it, moreover, her own way—forming a striking contrast with the night’s other speakers. She did it without shouting at the crowd. Without overtly slamming Republicans. Without turning explicitly negative. Her speech was laden with sharp barbs, but she delivered them calmly, sometimes wryly, biting her lower lip, hitting her cadence. It was a masterful performance.
When something goes wrong, I start with blunder, confusion, and miscalculation as the likely explanations. Planned-out wrongdoing is harder to pull off, more likely to backfire, and thus less probable.
But it is getting more difficult to dismiss the apparent Russian role in the DNC hack as blunder and confusion rather than plan.
“Real-world” authorities, from the former U.S. Ambassador to Russia to FBI sources to international security experts, say that the forensic evidence indicates the Russians. No independent authority strongly suggests otherwise. (Update: the veteran reporters Shane Harris and Nancy Youssef cite evidence that the original hacker was “an agent of the Russian government.”)
The timing and precision of the leaks, on the day before the Democratic convention and on a topic intended to maximize divisions at that convention, is unlikely to be pure coincidence. If it were coincidence, why exactly now, with evidence drawn from hacks over previous months? Why mail only from the DNC, among all the organizations that have doubtless been hacked?
The foreign country most enthusiastic about Trump’s rise appears to be Russia, which would also be the foreign country most benefited by his policy changes, from his sowing doubts about NATO and the EU to his weakening of the RNC platform language about Ukraine.
For the party elders, day one of the convention was about scolding the left back together.
Against a restive backdrop, the party’s top lieutenants were forced into the role of prime-time peacemakers, tasked with encouraging Democratic unity in a party that has only lately acquiesced to tenuous detente. They did so through a combination of alarmist truth-telling—born of the reality of a Trump-Clinton matchup that has lately gotten tighter—and cold-water scolding about party division—driven equally by frustration and exhaustion.
The pressures of national academic standards have pushed character education out of the classroom.
A few months ago, I presented the following scenario to my junior English students: Your boyfriend or girlfriend has committed a felony, during which other people were badly harmed. Should you or should you not turn him or her in to the police?
The class immediately erupted with commentary. It was obvious, they said, that loyalty was paramount—not a single student said they’d “snitch.” They were unequivocally unconcerned about who was harmed in this hypothetical scenario. This troubled me.
This discussion was part of an introduction to an essay assignment about whether Americans should pay more for ethically produced food. We continued discussing other dilemmas, and the kids were more engaged than they’d been in weeks, grappling with big questions about values, character, and right versus wrong as I attempted to expand their thinking about who and what is affected by their caloric choices—and why it matters.
The Democratic chairwoman had few supporters—but clung to her post for years, abetted by the indifference of the White House.
PHILADELPHIA—As Debbie Wasserman Schultz made her unceremonious exit as chairwoman of the Democratic National Committee, what was most remarkable was what you didn’t hear: practically anybody coming to her defense.
The Florida congresswoman did not go quietly. She reportedly resisted stepping down, and blamed subordinates for the content of the leaked emails that were released Friday, which clearly showed the committee’s posture of neutrality in the Democratic primary to have been a hollow pretense, just as Bernie Sanders and his supporters long contended. She finally relinquished the convention gavel only after receiving three days of strong-arming, a ceremonial position in the Clinton campaign, and a raucous round of boos at a convention breakfast.
Psychologists have long debated how flexible someone’s “true” self is.
Almost everyone has something they want to change about their personality. In 2014, a study that traced people’s goals for personality change found that the vast majority of its subjects wanted to be more extraverted, agreeable, emotionally stable, and open to new experiences. A whopping 97 percent said they wished they were more conscientious.
These desires appeared to be rooted in dissatisfaction. People wanted to become more extraverted if they weren’t happy with their sex lives, hobbies, or friendships. They wanted to become more conscientious if they were displeased with their finances or schoolwork. The findings reflect the social psychologist Roy Baumeister’s notion of “crystallization of discontent”: Once people begin to recognize larger patterns of shortcomings in their lives, he contends, they may reshuffle their core values and priorities to justify improving things.
Two new novels ponder the still-urgent question of what could have compelled young women to do such terrible things.
The most fascinating part of the Manson story has always been the girls.
Not the man who cobbled together bits of hippie philosophy, Scientology, and How to Win Friends and Influence People to gather followers who’d do his bidding and help make him a star (and when that didn’t work out, kill people to try to start a race war). The ones willing and vulnerable enough to be gathered. Who wanted a community to belong to.
Even now, no one knows whether Charles Manson believed his own insane manifesto, or was just using it as a tool to get what he wanted. But the girls believed. Patricia Krenwinkel, Leslie Van Houten, Susan Atkins—they believed. They belonged. And then, on two infamous evenings in 1969, they helped kill seven people.
Stock-market crashes, terrorist attacks, and the dark side of “newsworthy” stories
Man bites dog. It is one of the oldest clichés in journalism, an acknowledgement of the idea that ordinary events are not newsworthy, whereas oddities, like a puppy-nibbling adult, deserve disproportionate coverage.
The rule is straightforward, but its implications are subtle. If journalists are encouraged to report extreme events, they guide both elite and public attitudes, leading many people, including experts, to feel like extreme events are more common than they actually are. By reporting on only the radically novel, the press can feed a popular illusion that the world is more terrible than it actually is.
Take finance, for example. Professional investors are fretting about the possibility of a massive stock-market crash, on par with 1987’s Black Monday. The statistical odds that such an event will occur within the next six months are about 1-in-60, according to historical data from 1929 to 1988. But when surveys between 1989 and 2015 asked investors to estimate the odds of such a crash in the coming months, the typical response was 1-in-10.
Physicists can’t agree on whether the flow of future to past is real or a mental construct.
Einstein once described his friend Michele Besso as “the best sounding board in Europe” for scientific ideas. They attended university together in Zurich; later they were colleagues at the patent office in Bern. When Besso died in the spring of 1955, Einstein—knowing that his own time was also running out—wrote a now-famous letter to Besso’s family. “Now he has departed this strange world a little ahead of me,” Einstein wrote of his friend’s passing. “That signifies nothing. For us believing physicists, the distinction between past, present, and future is only a stubbornly persistent illusion.”
Einstein’s statement was not merely an attempt at consolation. Many physicists argue that Einstein’s position is implied by the two pillars of modern physics: Einstein’s masterpiece, the general theory of relativity, and the Standard Model of particle physics. The laws that underlie these theories are time-symmetric—that is, the physics they describe is the same, regardless of whether the variable called “time” increases or decreases. Moreover, they say nothing at all about the point we call “now”—a special moment (or so it appears) for us, but seemingly undefined when we talk about the universe at large. The resulting timeless cosmos is sometimes called a “block universe”—a static block of space-time in which any flow of time, or passage through it, must presumably be a mental construct or other illusion.