To hear the subjects talk about it, you'd think they'd discovered a new law of physics. Shocking as it might seem, it turns out that if you're not live-blogging or tweeting or reporting on an event as it unfolds--and you know that nobody is going to quote you or write about the event after the fact--you act and experience an event quite differently. "You actually listen to the conversation, not just wait for your turn to speak," marveled a blogger who, the New York Times reported in a recent article, has begun organizing strictly off-the-record gatherings in New York.
The difference between reporting an event (or performing for reporters) and simply experiencing an event or conversation is one journalists learn very early. Twenty years ago, when I got my first job as an aviation journalist and was assigned to cover air shows, my pilot friends were green with envy. I was going to get paid for what they'd do for free. But reporting on an event is very, very different from simply experiencing it. It turns you from a participant into an observer. You have to step back and gauge what facts are the most important to gather and what story line you're going to pursue. The same is true for photographers. The eye of a photographer is an analytical one, judging the best angle, best light, and best focal length in a world reduced to whatever narrow slice is visible through a viewfinder. Real-time impressions and emotion are sacrificed for a lasting, illustrative image. There is a cost to recording an event: a cost paid in removal from full immersion in, and enjoyment of, the moment as it actually happens.
By the same token, there's a night-and-day difference between a public versus a private conversation. Any journalist, spokesperson or politician could tell you that. So could anyone who has spent much time watching cable television pundits talk past each other in mind-numbing diatribes sparked by the presence of cameras and their attendant promise of publicity and notoriety.
But it used to be that only certain professionals understood, or had to struggle with, the downsides of either recording events or being the subject of a recorded event. Now, almost everyone is getting a taste of it, thanks to the prolific spread of blogging, Facebook, Twitter, and other mass-publication vehicles. And--thanks God, as my Italian neighbors would say--some of them are beginning to realize that the sword has two edges. That perhaps not everything has to be, or even should be, transmitted instantly into the universal, public realm.
There are many things to applaud about a world where more people can have voices and communication among groups is easier. Twitter proved an invaluable tool for protesters in Iran, helping them coordinate public gatherings and track police movements. And the mass proliferation of blogs reminds me, at least in some ways, of the dawning of the cable television era, when niche groups suddenly found programming targeted specifically to them.
There's also nothing inherently evil about any of the new communication methods or technologies. But most advances come with some kind of tradeoff, and anything used to excess begins to be problematic--as some people are beginning to find out with an all-shared, all-the-time lifestyle. One clear issue is preservation of privacy--especially for those who didn't volunteer to be part of a global chat room. But living a reported life 24 hours a day also has a cost, not only in terms of the type and quality of the interactions and conversations it allows, but also in terms of how present the reporters are in any given moment.
It's possible to both experience and report an event, but not instantaneously or simultaneously. When I had an assignment to fly a U-2 spy plane last fall, high enough to see the curvature of the earth, I got so preoccupied with taking photos and notes that I realized, part-way through the flight, that I wasn't actually experiencing any of it with any real depth. And to write anything of substance, I needed to first experience something of substance. So I turned off my intercom microphone, put the camera down, and just sat for a while. Looked out the window. Focused on what my senses were experiencing. Let my mind wander and my eyes drink in my surroundings. And in the richness of that silence, impressions softly bloomed. Of how fragile the world's atmosphere appeared. How being that high above the earth felt as if we were surreptitious invaders at the edge of a foreign realm ruled by powerful titans who needed no heat, air pressure or oxygen to survive. Of how lonely even a beautiful planet would be without anyone to welcome you home again.
And even those thoughts, put so cleanly into words here, took time to process and ferment, once I came back to earth.
New technology is all well and good. But the fact remains that it's tough to talk and listen at the same time, or be connected to an outside audience (even that of posterity) and still be fully immersed in the place, time, and dynamic of where you are. And that's a long-standing law of physics--or at least of human neuroscience and psychology--that I doubt any new technology is likely to overcome.
A new book by the evolutionary biologist Jerry Coyne tackles arguments that science and religion are compatible.
In May 1988, a 13-year-old girl named Ashley King was admitted to Phoenix Children’s Hospital by court order. She had a tumor on her leg—an osteogenic sarcoma—that, writes Jerry Coyne in his book Faith Versus Fact, was “larger than a basketball,” and was causing her leg to decay while her body started to shut down. Ashley’s Christian Scientist parents, however, refused to give doctors permission to amputate, and instead moved their daughter to a Christian Science sanatorium, where, in accordance with the tenets of their faith, “there was no medical care, not even pain medication.” Ashley’s mother and father arranged a collective pray-in to help her recover—to no avail. Three weeks later, she died.
Defining common cultural literacy for an increasingly diverse nation.
Is the culture war over?
That seems an absurd question. This is an age when Confederate monuments still stand; when white-privilege denialism is surging on social media; when legislators and educators in Arizona and Texas propose banning ethnic studies in public schools and assign textbooks euphemizing the slave trade; when fear of Hispanic and Asian immigrants remains strong enough to prevent immigration reform in Congress; when the simple assertion that #BlackLivesMatter cannot be accepted by all but is instead contested petulantly by many non-blacks as divisive, even discriminatory.
And that’s looking only at race. Add gender, guns, gays, and God to the mix and the culture war seems to be raging along quite nicely.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
In 1992, the neuroscientist Richard Davidson got a challenge from the Dalai Lama. By that point, he’d spent his career asking why people respond to, in his words, “life’s slings and arrows” in different ways. Why are some people more resilient than others in the face of tragedy? And is resilience something you can gain through practice?
The Dalai Lama had a different question for Davidson when he visited the Tibetan Buddhist spiritual leader at his residence in Dharamsala, India. “He said: ‘You’ve been using the tools of modern neuroscience to study depression, and anxiety, and fear. Why can’t you use those same tools to study kindness and compassion?’ … I did not have a very good answer. I said it was hard.”
Twenty-five years ago, Roseanne Barr sparked national fury when she delivered an off-key rendition of “The Star-Spangled Banner” in San Diego. But the reasons behind outrage and praise for various interpretations have as much to do with politics as with musical talent.
On July 25, 1990, the comedian Roseanne Barr stood in San Diego’s Jack Murphy Stadium before a baseball game, grabbed a microphone behind home plate, and, with her shirttails hanging out and sleeves rolled up, barked out what many consider to be the most unpatriotic performance of “The Star-Spangled Banner” in history. In a screech not unlike a fork being scratched across slate, Barr garbled her way through the lyrics, missed notes intentionally, and capped off the whole affair by grabbing her crotch and spitting on the ground. In her defense, those final gestures were meant as a parody of ballplayers’ behavior, but many of the 27,285 paying fans didn’t see it that way. What they saw was utter disrespect for the national anthem, and thus, the country. She had exercised her freedom of speech, so they exercised their right to boo her off the field.
People labeled “smart” at a young age don’t deal well with being wrong. Life grows stagnant.
At whatever age smart people develop the idea that they are smart, they also tend to develop vulnerability around relinquishing that label. So the difference between telling a kid “You did a great job” and “You are smart” isn’t subtle. That is, at least, according to one growing movement in education and parenting that advocates for retirement of “the S word.”
The idea is that when we praise kids for being smart, those kids think: Oh good, I'm smart. And then later, when those kids mess up, which they will, they think: Oh no, I'm not smart after all. People will think I’m not smart after all. And that’s the worst. That’s a risk to avoid, they learn. “Smart” kids stand to become especially averse to making mistakes, which are critical to learning and succeeding.
On Sunday, citizens will vote on how to move forward in the country's financial crisis.
On Sunday, the people of Greece will help decide the financial future of their country. With the nation already in default and capital controls in place to prevent a run on the banks, it’s up to Greece’s citizens to decide what road the country will take from here.
The referendum—which asks Greeks to vote either yes or no on a current proposal from Eurogroup leaders to extend financing to the deeply indebted country—was called for by Greek Prime Minister Alexis Tsipras amid meetings of Eurozone leaders as they tried to come up with a deal that would allow the country to avoid default. The call for a vote effectively ended discussions.
Opponents of the current proposal feel that the austerity measures put forth by the Eurogroup’s leaders—which would include things like tax hikes, pension cuts, and reductions in government jobs—are overly harsh and punitive, and could hurt Greeks more than help them.
Former Senator Jim Webb is the fifth Democrat to enter the race—and by far the most conservative one.
In a different era’s Democratic Party, Jim Webb might be a serious contender for the presidential nomination. He’s a war hero and former Navy secretary, but he has been an outspoken opponent of recent military interventions. He’s a former senator from Virginia, a purple state. He has a strong populist streak, could appeal to working-class white voters, and might even have crossover appeal from his days as a member of the Reagan administration.
In today’s leftward-drifting Democratic Party, however, it’s hard to see Webb—who declared his candidacy Thursday—getting very far. As surprising as Bernie Sanders’s rise in the polls has been, he looks more like the Democratic base than Webb does. The Virginian is progressive on a few major issues, including the military and campaign spending, but he’s far to the center or even the right on others: He's against affirmative action, supports gun rights, and is a defender of coal. During the George W. Bush administration, Democrats loved to have him as a foil to the White House. It’s hard to imagine the national electorate will cotton to him in the same way. Webb’s statement essentially saying he had no problem with the Confederate battle flag flying in places like the grounds of the South Carolina capitol may have been the final straw. (At 69, he’s also older than Hillary Clinton, whose age has been a topic of debate, though still younger than Bernie Sanders or Joe Biden.)
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.
The retired general and former CIA director holds forth on the Middle East.
ASPEN, Colo.—Retired U.S. Army General David Petraeus pioneered America’s approach to counterinsurgency, led the surge in Iraq, served as director of the CIA for a year, and was sentenced to two years’ probation for leaking classified information to his mistress. On Wednesday at the Aspen Ideas Festival, he was interviewed by my colleague Jeffrey Goldberg about subjects including efforts to stop Iran’s nuclear program; the civil war in Syria; ISIS and the threat it poses to the United States; and the Iraq War.
Here are several noteworthy moments from their conversation, slightly condensed:
The Risks of Attacking Iran
Jeffrey Goldberg: So you believe that, under certain circumstances, President Obama would still use military force against Iran?
David Petraeus: I think he would, actually. I know we’ve had red lines that didn’t turn out to be red lines. ... I think this is a different issue, and I clearly recognize how the administration has sought to show that this is very, very different from other sort of off-the-cuff remarks.
Goldberg: How did the Obama administration stop Israel from attacking Iran? And do you think that if this deal does go south, that Israel would be back in the picture?
Petraeus: I don’t, actually. I think Israel is very cognizant of its limitations. ... The Israelis do not have anything that can crack this deeply buried enrichment site ... and if you cannot do that, you’re not going to set the program back very much. So is it truly worth it, then?
So that’s a huge limitation. It’s also publicly known that we have a 30,000-pound projectile that no one else has, that no one else can even carry. The Massive Ordnance Penetrator was under design for almost six years. ... If necessary, we can take out all these facilities and set them back a few years, depending on your assumptions.
But that’s another roll of the iron dice, as Bismarck used to say, and you never know when those dice are rolled what the outcome is going to be. You don’t know what risks could materialize for those who are in harm’s way.
You don’t know what the response could be by Iran.
There’s always the chance that there will be salvos at Israel, but what if they decide to go at the Gulf states, where we have facilities in every single one?
This is not something to be taken lightly, clearly.