Before you watch the Super Bowl tonight, you could, should you be so inclined, head over to YouTube and watch a preview of an ad Kia will be airing during the game. The spot features the Victoria's Secret model Adriana Lima wearing very little and doing even less: She spends the entirety of the ad, hilariously and (one presumes) at least partially satirically, swaying, saying nothing, and waving a checkered racing flag. Very, very slowly.
Super Bowl commercials (the experience of, the economics of, etc.) used to be pretty straightforward: Advertisers would gladly pay tons of money for a slot during the game's broadcast because an ad aired during the game's broadcast was an amazingly efficient way of getting a message out to tons of people. That's still the case -- a 30-second spot is going, this year, for $3.5 million, up from $3 million last year -- but the mechanics of the messaging are changing, and rapidly. Super Bowl ads are no longer simply ads, in the Traditional Teevee sense; they're campaigns that play out, strategically, over time. Instead of functioning as commercial broadcasts unto themselves, they're acting more and more like episodic touchpoints for an expansive cultural conversation.
In part, that's about marketers racing each other for relevance in an environment where marketing messages no longer need to be confined to TV. But it's a bigger story, too -- of communications, overall, breaking free of the boxes that used to contain them. One function of the media, traditionally, has been the regulation not just of information, and not just of entertainment, but of time itself. Our broadcast networks, in particular, have segmented time into neat little boxes -- 30 seconds here, 30 minutes there -- and populated them with sounds and images that entertain and (occasionally) edify us. They have plotted our days into grids, scheduling our experience and helping us to forget that, in fact, there's very little that's natural about a time slot.
Super Bowl ads have been pretty much the Platonic culmination of the gridded media system. They have operated on the assumption that a Big Event itself (the experience of, the economics of) is significant not just because of its content, but because of the community it convenes (111 million people!). The Super Bowl is time rendered collective and contained -- so of course marketers want to buy themselves a chunk of it. When better to make your pitch to the world than during the period when the greatest number of eyes are focused on, effectively, the same screen?
YouTube, and social networks in general, encourage precisely the opposite marketing model. Rather than containing consumer attention, they disperse it. They take the typical 30-second ad spot and condense it to five seconds ... or expand it to five hours. Or both. Or neither. It doesn't matter, because digital spaces remove time as both a constraint and a value in commercial production, allowing for marketing that insinuates itself into its intended audiences much more slowly, and much more manipulatively, and potentially much more effectively, than its analog counterparts.
You'd think all that would be bad news for broadcast networks, with marketers trading the boob tube for YouTube and abandoning the pricey Super Bowl altogether. Why buy the cow when you can get the milk for free, and all that. But: Not only are marketers continuing to pay for something they could ostensibly get for free; they're paying more for it than they ever have before. They're still finding value -- millions of dollars worth of it -- in the connective consciousness that the Super Bowl represents.
And that's because, in a world of atomized attention, anything that can aggregate us is becoming more valuable than it's ever been before. Ads aired during the Super Bowl aren't just ads; they're Super Bowl ads. That branding will give them a spot -- and a continued life -- in Monday's write-ups of Sunday's best Super Bowl spots, and in all those "Super Bowl Ads: 2012" collections that will function as archives for future generations. Their context will make them more than what they are. And that will make them, implicitly, more engaging than they might be otherwise. Super Bowl ads, as my colleague Jordan Weissmann has pointed out, have been found to be 58 percent more memorable than regular ads. And while that's partly, sure, because those ads generally represent the best stuff that J. Walter and friends have to offer, it's also because the ads, aired when they are, adopt the warmth of assumed connection that convened attention can confer. I am watching Matthew Broderick as 110,999,999 other people do. There is something epic -- and rare -- about that.
So Super Bowl ads are increasingly valuable because the kind of mass-conscious event they're part of is increasingly rare. Mass-ness itself is increasingly rare. Overall, in the U.S., TV viewership is declining. Audiences are fragmenting. The Gladwellian connectors that used to bring us together -- Lucy, J.R., Oscar -- are departing, leaving individual impulse as the driver of our time. This is wonderful, and liberating, but introduces its own set of quandaries. TV Guide, after all, wasn't just a guide book; it was a framework. It was a power structure. It assembled us, effortlessly, within its neat little boxes. By limiting our experience, it also connected our experience.
No longer. Increasingly, we're looking to social networks rather than TV networks for our entertainment, for our information, for our sense of the world. And those social networks are fluid and box-less and limitless in a way that traditional media never could be. What happens to events themselves -- those shared moments of cultural connection -- in a world where time is unconstrained? Is a Super Bowl ad really a Super Bowl ad when I can watch it long before kickoff?
A new anatomical understanding of how movement controls the body’s stress response system
Elite tennis players have an uncanny ability to clear their heads after making errors. They constantly move on and start fresh for the next point. They can’t afford to dwell on mistakes.
Peter Strick is not a professional tennis player. He’s a distinguished professor and chair of the department of neurobiology at the University of Pittsburgh Brain Institute. He’s the sort of person to dwell on mistakes, however small.
“My kids would tell me, dad, you ought to take up pilates. Do some yoga,” he said. “But I’d say, as far as I’m concerned, there’s no scientific evidence that this is going to help me.”
Still, the meticulous skeptic espoused more of a tennis approach to dealing with stressful situations: Just teach yourself to move on. Of course there is evidence that ties practicing yoga to good health, but not the sort that convinced Strick. Studies show correlations between the two, but he needed a physiological mechanism to explain the relationship. Vague conjecture that yoga “decreases stress” wasn’t sufficient. How? Simply by distracting the mind?
No one will ever find a closer exoplanet—now the race is on to see if there is life on its surface.
One hundred and one years ago this October, a Scottish astronomer named Robert Innes pointed a camera at a grouping of stars near the Southern Cross, the defining feature of the night skies above his adopted Johannesburg. He was looking for a small companion to Alpha Centauri, our closest neighboring star system.
Hunched over glass photographic plates, Innes teased out a signal. Across five years of images, a small, faint star moved, wiggling on the sky. It shifted just as much as Alpha Centauri, suggesting its fate was intertwined with that binary system. But this small star was closer to the sun than Alpha. Innes suggested calling it Proxima Centauri, using the Latin word for “nearest.”
The dim red star soon entered the collective imagination, inspiring dreams of interstellar travel. Gravity has linked the star to the Alpha Centauri system, but our culture of science and storytelling has linked it to the solar system. Today, that link will grow stronger, when an international team of astronomers announces that this nearest of stars also hosts the closest exoplanet, one that might look a whole lot like Earth.
Do mission-driven organizations with tight budgets have any choice but to demand long, unpaid hours of their staffs?
Earlier this year, at the encouragement of President Obama, the Department of Labor finalized the most significant update to the federal rules on overtime in decades. The new rules will more than double the salary threshold for guaranteed overtime pay, from about $23,000 to $47,476. Once the rules go into effect this December, millions of employees who make less than that will be guaranteed overtime pay under the law when they work more than 40 hours a week.
Unsurprisingly, some business lobbies and conservatives disparaged the rule as unduly burdensome. But pushback also came from what might have been an unexpected source: a progressive nonprofit called the U.S. Public Interest Research Group (PIRG). “Doubling the minimum salary to $47,476 is especially unrealistic for non-profit, cause-oriented organizations,” U.S. PIRG said in a statement. “[T]o cover higher staffing costs forced upon us under the rule, we will be forced to hire fewer staff and limit the hours those staff can work—all while the well-funded special interests that we're up against will simply spend more.”
City dwellers spend nearly every moment of every day awash in wi-fi signals. Homes, streets, businesses, and office buildings are constantly blasting wireless signals every which way for the benefit of nearby phones, tablets, laptops, wearables, and other connected paraphernalia.
When those devices connect to a router, they send requests for information—a weather forecast, the latest sports scores, a news article—and, in turn, receive that data, all over the air. As it communicates with the devices, the router is also gathering information about how its signals are traveling through the air, and whether they’re being disrupted by obstacles or interference. With that data, the router can make small adjustments to communicate more reliably with the devices it’s connected to.
This much is obvious: Young people don’t buy homes like they used to.
In the aftermath of the recession and weak recovery, the share of 18-to-34-year-olds—a.k.a. Millennials—who own a home has fallen to a 30-year low. For the first time on record going back more than a century, young people are now more likely to live with their parents than with a spouse.
It’s become in vogue to argue that young people’s turn against homeownership might be a good thing. After all, houses are not always dependable investment vehicles, a lesson the country learned all too painfully after the Great Recession. Without being anchored to any one city from their mid-20s and into their 30s, young people who don’t own are free to roam about the country in search of the best jobs. What’s more, given the copious advantages of a college degree in this economy, perhaps many young people could be commended for investing in their intelligence, professional networks, and abilities rather than devoting that same income to a roof, floor, and furniture.
Donald Trump’s campaign manager says he’s actually winning, thanks to “undercover” supporters. Plenty of past presidential hopefuls have mistakenly believed the same.
A candidate or operative on a campaign that’s losing has three options: despair; accept what’s happening and try to fix it; or deny. Right now, the Donald Trump campaign is exhibiting all three.
For despair, there are the staffers who are reportedly “suicidal” inside Trump Tower, and those who have simply quit. For acceptance, Trump himself has admitted he’s in trouble. But newly promoted campaign manager Kellyanne Conway is taking the denial route.
“Donald Trump performs consistently better in online polling where a human being is not talking to another human being about what he or she may do in the election,” she told the British outlet Channel 4. “It’s because it’s become socially desirable, if you’re a college educated person in the United States of America, to say that you’re against Donald Trump.”
The U.S. presidential nominee’s anti-Islam rhetoric has motivated some to speak out against stereotypes.
Donald Trump has effectively declared Muslims the enemy, accusing them of shielding terrorists in their midst, pushing to ban them from entering the country, and suggesting that the United States should start thinking seriously about profiling them. In response, some American Muslim women are speaking out against Trump and his anti-Muslim rhetoric.
“I never really felt like I was ‘the other’ until now,” said Mirriam Seddiq, a 45-year-old immigration and criminal-defense lawyer from Northern Virginia who recently started a political-action committee called American Muslim Women. “It’s a strange realization to have, but it’s what motivated me to do this. There are so many misconceptions about Muslim women, and I want to help counter that narrative.”
Finally, an explanation for Bitchy Resting Face Nation
Here’s something that has always puzzled me, growing up in the U.S. as a child of Russian parents. Whenever I or my friends were having our photos taken, we were told to say “cheese” and smile. But if my parents also happened to be in the photo, they were stone-faced. So were my Russian relatives, in their vacation photos. My parents’ high-school graduation pictures show them frolicking about in bellbottoms with their young classmates—and looking absolutely crestfallen.
It’s not just photos: Russian women do not have to worry about being instructed by random men to “smile.” It is Bitchy Resting Face Nation, seemingly forever responding “um, I guess?” to any question the universe might pose.
This does not mean we are all unhappy! Quite the opposite: The virile ruler, the vodka, the endless mounds of sour cream—they are pleasing to some. It’s just that grinning without cause is not a skill Russians possess or feel compelled to cultivate. There’s even a Russian proverb that translates, roughly, to “laughing for no reason is a sign of stupidity.”
A new survey suggests the logistics of going to services can be the biggest barrier to participation—and Americans’ faith in religious institutions is declining.
The standard narrative of American religious decline goes something like this: A few hundred years ago, European and American intellectuals began doubting the validity of God as an explanatory mechanism for natural life. As science became a more widely accepted method for investigating and understanding the physical world, religion became a less viable way of thinking—not just about medicine and mechanics, but also about culture and politics and economics and every other sphere of public life. As the United States became more secular, people slowly began drifting away from faith.
Of course, this tale is not just reductive—it’s arguably inaccurate, in that it seems to capture neither the reasons nor the reality behind contemporary American belief. For one thing, the U.S. is still overwhelmingly religious, despite years of predictions about religion’s demise. A significant number of people who don’t identify with any particular faith group still say they believe in God, and roughly 40 percent pray daily or weekly. While there have been changes in this kind of private belief and practice, the most significant shift has been in the way people publicly practice their faith: Americans, and particularly young Americans, are less likely to attend services or identify with a religious group than they have at any time in recent memory.