Before you watch the Super Bowl tonight, you could, should you be so inclined, head over to YouTube and watch a preview of an ad Kia will be airing during the game. The spot features the Victoria's Secret model Adriana Lima wearing very little and doing even less: She spends the entirety of the ad, hilariously and (one presumes) at least partially satirically, swaying, saying nothing, and waving a checkered racing flag. Very, very slowly.
Super Bowl commercials (the experience of, the economics of, etc.) used to be pretty straightforward: Advertisers would gladly pay tons of money for a slot during the game's broadcast because it was an amazingly efficient way of getting a message out to tons of people. That's still the case -- a 30-second spot is going, this year, for $3.5 million, up from $3 million last year -- but the mechanics of the messaging are changing, and rapidly. Super Bowl ads are no longer simply ads, in the Traditional Teevee sense; they're campaigns that play out, strategically, over time. Instead of functioning as commercial broadcasts unto themselves, they're acting more and more like episodic touchpoints for an expansive cultural conversation.
In part, that's about marketers racing each other for relevance in an environment where marketing messages no longer need to be confined to TV. But it's a bigger story, too -- of communications, overall, breaking free of the boxes that used to contain them. One function of the media, traditionally, has been the regulation not just of information, and not just of entertainment, but of time itself. Our broadcast networks, in particular, have segmented time into neat little boxes -- 30 seconds here, 30 minutes there -- and populated them with sounds and images that entertain and (occasionally) edify us. They have plotted our days into grids, scheduling our experience and helping us to forget that, in fact, there's very little that's natural about a time slot.
Super Bowl ads have been pretty much the Platonic culmination of the gridded media system. They have operated on the assumption that a Big Event is significant not just because of its content, but because of the community it convenes (111 million people!). The Super Bowl is time rendered collective and contained -- so of course marketers want to buy themselves a chunk of it. When better to make your pitch to the world than during the period when the maximum number of eyes are focused on, effectively, the same screen?
YouTube, and social networks in general, encourage precisely the opposite marketing model. Rather than containing consumer attention, they disperse it. They take the typical 30-second ad spot and condense it to five seconds ... or expand it to five hours. Or both. Or neither. It doesn't matter, because digital spaces remove time as both a constraint and a value in commercial production, allowing for marketing that insinuates itself into its intended audiences much more slowly, and much more manipulatively, and potentially much more effectively, than its analog counterparts.
You'd think all that would be bad news for broadcast networks, with marketers trading the boob tube for YouTube and abandoning the pricey Super Bowl altogether. Why buy the milk, and all that. But: Not only are marketers continuing to pay for something they could ostensibly get for free; they're paying more for it than they ever have before. They're still finding value -- millions of dollars' worth of it -- in the connective consciousness that the Super Bowl represents.
And that's because, in a world of atomized attention, anything that can aggregate us is becoming more valuable than it's ever been before. Ads aired during the Super Bowl aren't just ads; they're Super Bowl ads. That branding will give them a spot -- and a continued life -- in Monday's write-ups of Sunday's best Super Bowl spots, and in all those "Super Bowl Ads: 2012" collections that will function as archives for future generations. Their context will make them more than what they are. And that will make them, implicitly, more engaging than they might be otherwise. Super Bowl ads, as my colleague Jordan Weissmann has pointed out, have been found to be 58 percent more memorable than regular ads. And while that's partly, sure, because those ads generally represent the best stuff that J. Walter and friends have to offer, it's also because the ads, aired when they are, adopt the warmth of assumed connection that convened attention can confer. I am watching Matthew Broderick as 110,999,999 other people do. There is something epic -- and rare -- about that.
So Super Bowl ads are increasingly valuable because the kind of mass-conscious event they're part of is increasingly rare. Mass-ness itself is increasingly rare. Overall, in the U.S., TV viewership is declining. Audiences are fragmenting. The Gladwellian connectors that used to bring us together -- Lucy, J.R., Oscar -- are departing, leaving individual impulse as the driver of our time. This is wonderful, and liberating, but it introduces its own set of quandaries. TV Guide, after all, wasn't just a guidebook; it was a framework. It was a power structure. It assembled us, effortlessly, within its neat little boxes. By limiting our experience, it also connected our experience.
No longer. Increasingly, we're looking to social networks rather than TV networks for our entertainment, for our information, for our sense of the world. And those social networks are fluid and box-less and limitless in a way that traditional media never could be. What happens to events themselves -- those shared moments of cultural connection -- in a world where time is unconstrained? Is a Super Bowl ad really a Super Bowl ad when I can watch it long before kickoff?
About 10 years ago, after I’d graduated from college but while I was still waitressing full-time, I attended an empowerment seminar. It was the kind of nebulous weekend-long event sold as helping people discover their dreams and unburden themselves from past trauma through honesty exercises and the encouragement to “be present.” But there was one moment I’ve never forgotten. The group leader, a man in his 40s, asked anyone in the room of 200 or so people who’d been sexually or physically abused to raise their hands. Six or seven hands tentatively went up. The leader instructed us to close our eyes, and asked the question again. Then he told us to open our eyes. Almost every hand in the room was raised.
And there could be far-reaching consequences for the national economy too.
Four floors above a dull cinder-block lobby in a nondescript building at the Ohio State University, the doors of a slow-moving elevator open on an unexpectedly futuristic 10,000-square-foot laboratory bristling with technology. It’s a reveal reminiscent of a James Bond movie. In fact, the researchers who run this year-old, $750,000 lab at OSU’s Spine Research Institute often resort to Hollywood comparisons.
Thin beams of blue light shoot from 36 of the same kind of infrared motion cameras used to create lifelike characters for films like Avatar. In this case, the researchers are studying the movements of a volunteer fitted with sensors that track his skeleton and muscles as he bends and lifts. Among other things, they say, their work could lead to the kind of robotic exoskeletons imagined in the movie Aliens.
Four decades ago Jimmy Carter was sworn in as the 39th president of the United States, the original Star Wars movie was released in theaters, the Trans-Alaska pipeline pumped its first barrels of oil, New York City suffered a massive blackout, Radio Shack introduced its new TRS-80 Micro Computer, Grace Jones was a disco queen, the Brazilian soccer star Pele played his “sayonara” game in Japan, and much more. Take a step into a visual time capsule now, for a brief look at the year 1977.
In the media world, as in so many other realms, there is a sharp discontinuity in the timeline: before the 2016 election, and after.
Things we thought we understood—narratives, data, software, news events—have had to be reinterpreted in light of Donald Trump’s surprising win as well as the continuing questions about the role that misinformation and disinformation played in his election.
Tech journalists covering Facebook had a duty to cover what was happening before, during, and after the election. Reporters tried to see past their often liberal political orientations and the unprecedented actions of Donald Trump to understand how 2016 was playing out on the internet. Every component of the chaotic digital campaign has been reported on, at The Atlantic and elsewhere: Facebook’s enormous distribution power for political information, rapacious partisanship reinforced by distinct media information spheres, the increasing scourge of “viral” hoaxes and other kinds of misinformation that could propagate through those networks, and the Russian information ops agency.
More comfortable online than out partying, post-Millennials are safer, physically, than adolescents have ever been. But they’re on the brink of a mental-health crisis.
One day last summer, around noon, I called Athena, a 13-year-old who lives in Houston, Texas. She answered her phone—she’s had an iPhone since she was 11—sounding as if she’d just woken up. We chatted about her favorite songs and TV shows, and I asked her what she likes to do with her friends. “We go to the mall,” she said. “Do your parents drop you off?” I asked, recalling my own middle-school days, in the 1980s, when I’d enjoy a few parent-free hours shopping with my friends. “No—I go with my family,” she replied. “We’ll go with my mom and brothers and walk a little behind them. I just have to tell my mom where we’re going. I have to check in every hour or every 30 minutes.”
Those mall trips are infrequent—about once a month. More often, Athena and her friends spend time together on their phones, unchaperoned. Unlike the teens of my generation, who might have spent an evening tying up the family landline with gossip, they talk on Snapchat, the smartphone app that allows users to send pictures and videos that quickly disappear. They make sure to keep up their Snapstreaks, which show how many days in a row they have Snapchatted with each other. Sometimes they save screenshots of particularly ridiculous pictures of friends. “It’s good blackmail,” Athena said. (Because she’s a minor, I’m not using her real name.) She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”
The foundation of Donald Trump’s presidency is the negation of Barack Obama’s legacy.
It is insufficient to state the obvious of Donald Trump: that he is a white man who would not be president were it not for this fact. With one immediate exception, Trump’s predecessors made their way to high office through the passive power of whiteness—that bloody heirloom which cannot ensure mastery of all events but can conjure a tailwind for most of them. Land theft and human plunder cleared the grounds for Trump’s forefathers and barred others from it. Once upon the field, these men became soldiers, statesmen, and scholars; held court in Paris; presided at Princeton; advanced into the Wilderness and then into the White House. Their individual triumphs made this exclusive party seem above America’s founding sins, and it was forgotten that the former was in fact bound to the latter, that all their victories had transpired on cleared grounds. No such elegant detachment can be attributed to Donald Trump—a president who, more than any other, has made the awful inheritance explicit.
How a seemingly innocuous phrase became a metonym for the skewed sexual politics of show business
The chorus of condemnation against Harvey Weinstein, as dozens of women have come forward to accuse the producer of serial sexual assault and harassment, has often turned on a quaint-sounding show-business cliché: the “casting couch.” Glenn Close, for instance, expressed her anger that “the ‘casting couch’ phenomenon, so to speak, is still a reality in our business and in the world.”
The casting couch—where, as the story goes, aspiring actresses had to trade sexual favors in order to win roles—has been a familiar image in Hollywood since the advent of the studio system in the 1920s and ’30s. Over time, the phrase has become emblematic of the way that sexual aggression has been normalized in an industry dominated by powerful men.
A driver, a transportation official, and a transit advocate explain why Seattle recently saw one of the biggest citywide increases in passenger numbers.
Almost every major U.S. city has seen years of decline in bus ridership, but Seattle has been the exception in recent years. Between 2010 and 2014, Seattle experienced the biggest jump in bus ridership of any major U.S. city. At its peak in 2015, around 78,000 people, or about one in five Seattle workers, rode the bus to work.
That trend has cooled slightly since then, but Seattle continues to see increased overall transit ridership, bucking the national trend of decline. In 2016, Seattle saw transit ridership increase by 4.1 percent—only Houston and Milwaukee saw even half that increase in the same year.
Bus service is crucial to reducing emissions in the Seattle region. According to King County Metro, which serves the region, nearly half of all greenhouse gas emissions in Washington state come from transportation; by taking cars off the road and reducing traffic congestion, Metro's operation displaces roughly four times as many emissions as it generates. The public transit authority has been recognized for its commitment to sustainability, and its bus fleet is projected to be 100 percent hybrid or electric by 2018.
The president managed to cause a brief firestorm by falsely accusing predecessors of neglecting slain soldiers, but real answers about why four men were killed are still elusive.
On October 4, four American Special Forces soldiers were killed during an operation in Niger. Since then, the White House has been notably tight-lipped about the incident. During a press conference Monday afternoon, 12 days after the deaths, President Trump finally made his first public comments, but the remarks—in which he admitted he had not yet spoken with the families and briefly attacked Barack Obama—did little to clarify what happened or why the soldiers were in Niger.
Trump spoke at the White House after a meeting with Senate Majority Leader Mitch McConnell, and was asked why he hadn’t spoken about the deaths of Sergeant La David Johnson and Staff Sergeants Bryan Black, Dustin Wright, and Jeremiah Johnson.
For the first time, astronomers have detected visible light and gravitational waves from the same source, ushering in a new era in our attempt to understand the cosmos.
In September of 2015, astronomers detected, for the first time, gravitational waves, cosmic ripples that distort the very fabric of space and time. They came from a violent merger of two black holes somewhere in the universe, more than a billion light-years away from Earth. Astronomers observed the phenomenon again in December of that year, then in January 2017, and then again in August of this year. The discoveries confirmed a century-old prediction by Albert Einstein, earned a Nobel Prize, and ushered in a new field of astronomy.
But while astronomers could observe the effects of the waves in the sensitive instruments built to detect them, they couldn’t see the source. Black holes, as their name suggests, don’t emit any light. To directly observe the origin of gravitational waves, astronomers needed a different kind of collision to send the ripples Earth’s way. This summer, they finally got it.