To hear the subjects talk about it, you'd think they'd discovered a new law of physics. Shocking as it might seem, it turns out that if you're not live-blogging or tweeting or reporting on an event as it unfolds--and you know that nobody is going to quote you or write about the event after the fact--you act and experience an event quite differently. "You actually listen to the conversation, not just wait for your turn to speak," marveled a blogger who, the New York Times reported in a recent article, has begun organizing strictly off-the-record gatherings in New York.
The difference between reporting an event (or performing for reporters) and simply experiencing an event or conversation is one journalists learn very early. Twenty years ago, when I got my first job as an aviation journalist and was assigned to cover air shows, my pilot friends were green with envy. I was going to get paid for what they'd do for free. But reporting on an event is very, very different from simply experiencing it. It turns you from a participant into an observer. You have to step back and gauge which facts are the most important to gather and what story line you're going to pursue. The same is true for photographers. The eye of a photographer is an analytical one, judging the best angle, best light, and best focal length in a world reduced to whatever narrow slice is visible through a viewfinder. Real-time impressions and emotion are sacrificed for a lasting, illustrative image. There is a cost to recording an event: a cost paid in removal from full immersion in, or enjoyment of, the moment as it actually happens.
By the same token, there's a night-and-day difference between a public versus a private conversation. Any journalist, spokesperson or politician could tell you that. So could anyone who has spent much time watching cable television pundits talk past each other in mind-numbing diatribes sparked by the presence of cameras and their attendant promise of publicity and notoriety.
But it used to be that only certain professionals understood, or had to struggle with, the downsides of either recording events, or being the subject of a recorded event. Now, almost everyone is getting a taste of it, thanks to the prolific spread of blogging, Facebook, Twitter and other mass publication vehicles. And--thanks God, as my Italian neighbors would say--some of them are beginning to realize that the sword has two sides. That perhaps not everything has to be, or even should be, transmitted instantly into the universal, public realm.
There are many things to applaud about a world where more people can have voices and communication among groups is easier. Twitter proved an invaluable tool for protesters in Iran, who used it to coordinate public gatherings and track police movements. And the mass proliferation of blogs reminds me, at least in some ways, of the dawning of the cable television era, when niche groups suddenly found programming targeted specifically to them.
There's also nothing inherently evil about any of the new communication methods or technologies. But most advances come with some kind of tradeoff, and anything used to excess begins to be problematic--as some people are beginning to find out with an all-shared, all-the-time lifestyle. One clear issue is preservation of privacy--especially for those who didn't volunteer to be part of a global chat room. But living a reported life 24 hours a day also has a cost, not only in terms of the type and quality of the interactions and conversations it allows, but also in terms of how present the reporters are in any given moment.
It's possible to both experience and report an event, but not instantaneously or simultaneously. When I had an assignment to fly a U-2 spy plane last fall, high enough to see the curvature of the earth, I got so preoccupied with taking photos and notes that I realized, partway through the flight, that I wasn't actually experiencing any of it with any real depth. And to write anything of substance, I needed to first experience something of substance. So I turned off my intercom microphone, put the camera down, and just sat for a while. Looked out the window. Focused on what my senses were experiencing. Let my mind wander and my eyes drink in my surroundings. And in the richness of that silence, impressions softly bloomed. Of how fragile the world's atmosphere appeared. How being that high above the earth felt as if we were surreptitious invaders at the edge of a foreign realm ruled by powerful titans who needed no heat, air pressure or oxygen to survive. Of how lonely even a beautiful planet would be without anyone to welcome you home again.
And even those thoughts, put so cleanly into words here, took time to process and ferment, once I came back to earth.
New technology is all well and good. But the fact remains that it's tough to talk and listen at the same time, or be connected to an outside audience (even that of posterity) and still be fully immersed in the place, time, and dynamic of where you are. And that's a long-standing law of physics--or at least of human neuroscience and psychology--that I doubt any new technology is likely to overcome.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
As the public’s fear and loathing surge, the frontrunner’s durable candidacy has taken a dark turn.
MYRTLE BEACH, South Carolina—All politicians, if they are any good at their craft, know the truth about human nature.
Donald Trump is very good, and he knows it better than most.
Trump stands alone on a long platform, surrounded by a rapturous throng. Below and behind him—sitting on bleachers and standing on the floor—they fill this city’s cavernous, yellow-beige convention center by the thousands. As Trump will shortly point out, there are a lot of other Republican presidential candidates, but none of them get crowds anything like this.
Trump raises an orange-pink hand like a waiter holding a tray. “They are not coming in from Syria,” he says. “We’re sending them back!” The crowd surges, whistles, cheers. “So many bad things are happening—they have sections of Paris where the police are afraid to go,” he continues. “Look at Belgium, the whole place is closed down! We can’t let it happen here, folks.”
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
A yearlong investigation of Greek houses reveals their endemic, lurid, and sometimes tragic problems—and a sophisticated system for shifting the blame.
One warm spring night in 2011, a young man named Travis Hughes stood on the back deck of the Alpha Tau Omega fraternity house at Marshall University, in West Virginia, and was struck by what seemed to him—under the influence of powerful inebriants, not least among them the clear ether of youth itself—to be an excellent idea: he would shove a bottle rocket up his ass and blast it into the sweet night air. And perhaps it was an excellent idea. What was not an excellent idea, however, was to misjudge the relative tightness of a 20-year-old sphincter and the propulsive reliability of a 20-cent bottle rocket. What followed ignition was not the bright report of a successful blastoff, but the muffled thud of fire in the hole.
Why are so many kids with bright prospects killing themselves in Palo Alto?
The air shrieks, and life stops. First, from far away, comes a high whine like angry insects swarming, and then a trampling, like a herd moving through. The kids on their bikes who pass by the Caltrain crossing are eager to get home from school, but they know the drill. Brake. Wait for the train to pass. Five cars, double-decker, tearing past at 50 miles an hour. Too fast to see the faces of the Silicon Valley commuters on board, only a long silver thing with black teeth. A Caltrain coming into a station slows, invites you in. But a Caltrain at a crossing registers more like an ambulance, warning you fiercely out of its way.
The kids wait until the passing train forces a gust you can feel on your skin. The alarms ring and the red lights flash for a few seconds more, just in case. Then the gate lifts up, signaling that it’s safe to cross. All at once life revives: a rush of bikes, skateboards, helmets, backpacks, basketball shorts, boisterous conversation. “Ew, how old is that gum?” “The quiz is next week, dipshit.” On the road, a minivan makes a left a little too fast—nothing ominous, just a mom late for pickup. The air is again still, like it usually is in spring in Palo Alto. A woodpecker does its work nearby. A bee goes in search of jasmine, stinging no one.
Mary Beard’s sweeping history is a new read of citizenship in the ancient empire.
A British college student named Megan Beech recently published a poetry collection called When I Grow Up I Want to Be Mary Beard. Beech is not alone in her admiration for Beard, who was for a time the only female classics lecturer at Cambridge University and has since become the most prominent representative of a field once associated with dusty male privilege. In 2013, Beard was appointed to the Order of the British Empire for “services to Classical Scholarship.” A prolific authority on Roman culture, she construes those services broadly. Her academic work ranges from studies of Roman religion and Roman victory practices to reflections on Roman laughter, and she has written lively books about Pompeii and the Colosseum. As the erudite docent on a BBC series three years ago titled Meet the Romans, Beard introduced a bigger audience to a bigger Rome: a citizenry far beyond the handful of Latin-speaking men who populated the Senate, served as emperors, or wrote (often dictating to their slaves) the books that we call “Roman literature.” Whatever the context (she also writes a blog, “A Don’s Life,” for the Times Literary Supplement), Beard does precisely what few popularizers dare to try and plenty of dons can’t pull off: She conveys the thrill of puzzling over texts and events that are bound to be ambiguous, and she complicates received wisdom in the process.
An entire industry has been built on the premise that creating gourmet meals at home is simple and effortless. But it isn’t true.
I write about food for a living. Because of this, I spend more time than the average American surrounded by cooking advice and recipes. I’m also a mother, which means more often than not, when I return from work 15 minutes before bedtime, I end up feeding my 1-year-old son squares of peanut-butter toast because there was nothing in the fridge capable of being transformed into a wholesome, homemade toddler meal in a matter of minutes. Every day, when I head to my office after a nourishing breakfast of smashed blueberries or oatmeal I found stuck to the pan, and open a glossy new cookbook, check my RSS feed, or page through a stack of magazines, I’m confronted by an impenetrable wall of unimaginable cooking projects, just sitting there pretending to be totally reasonable meals. Homemade beef barbacoa tacos. Short-rib potpie. “Weekday” French toast. Make-ahead coconut cake. They might as well be skyscraper blueprints, so improbable is the possibility that I will begin making my own nut butters, baking my own sandwich bread, or turning that fall farmers’ market bounty into jars of homemade applesauce.
Retailers are experimenting with a bold new strategy for the commercial high holiday: boycotting themselves.
It starts with a scene of touch football in the yard. Next, a woman and a girl, cooking together in the kitchen. “Imagine a world,” a soothing voice intones, “where the only thing you have to wrestle for on Thanksgiving is the last piece of pumpkin pie, and the only place we camped out was in front of a fire, and not the parking lot of a store.” And, then, more scenes: a man, cuddling with kids on a couch. An older woman, rolling pie dough on the counter. A fire, crackling in the fireplace. Warmth. Wine. Togetherness. Laughter.
It’s an ad, unsurprisingly, but it’s an ad with a strange objective: to tell you not to buy stuff. Or, at least, to spend a day not buying stuff. “At T.J. Maxx, Marshall’s, and HomeGoods, we’re closed on Thanksgiving,” the spot’s velvet-voiced narrator informs us, “because family time comes first.” And then: more music. More scenes of familiar/familial delights. More laughter. More pie. The whole thing concludes: “Let’s put more value on what really matters. This season, bring back the holidays—with T.J. Maxx, Marshall’s, and HomeGoods.”
Why trying to think like the Islamic State is so hard—and risky.
In killing 130 civilians in Paris—the worst such attack in France since World War II—ISIS has forced us to contend, once again, with the question of the “rationality” of self-professed ideologues. Since it wrested the world’s attention with its capture of Iraq’s second-largest city in June 2014, the extremist group has prioritized state-building over fighting far enemies abroad. This is what distinguished ISIS: It wasn’t just, or even primarily, a terrorist organization. It had an unusually pronounced interest in governance. As Yale University’s Andrew March and Mara Revkin lay out in considerable detail, the group focused its energy on developing fairly elaborate institutional structures in the territory it controlled within Iraq and Syria. ISIS wasn’t simply making things up as it went along. It may have been mad, but there was a method to the madness.