To hear the subjects talk about it, you'd think they'd discovered a new law of physics. Shocking as it might seem, it turns out that if you're not live-blogging or tweeting or reporting on an event as it unfolds--and you know that nobody is going to quote you or write about the event after the fact--you act and experience an event quite differently. "You actually listen to the conversation, not just wait for your turn to speak," marveled a blogger who, the New York Times reported in a recent article, has begun organizing strictly off-the-record gatherings in New York.
The difference between reporting an event (or performing for reporters) and simply experiencing an event or conversation is one journalists learn very early. Twenty years ago, when I got my first job as an aviation journalist and was assigned to cover air shows, my pilot friends were green with envy. I was going to get paid for what they'd do for free. But reporting on an event is very, very different from simply experiencing it. It turns you from a participant into an observer. You have to step back and gauge which facts are the most important to gather and what story line you're going to pursue. The same is true for photographers. The eye of a photographer is an analytical one, judging the best angle, best light, and best focal length in a world reduced to whatever narrow slice is visible through a viewfinder. Real-time impressions and emotion are sacrificed for a lasting, illustrative image. There is a cost to recording an event: a cost paid in removal from full immersion in, and enjoyment of, the moment as it actually happens.
By the same token, there's a night-and-day difference between a public versus a private conversation. Any journalist, spokesperson or politician could tell you that. So could anyone who has spent much time watching cable television pundits talk past each other in mind-numbing diatribes sparked by the presence of cameras and their attendant promise of publicity and notoriety.
But it used to be that only certain professionals understood, or had to struggle with, the downsides of either recording events, or being the subject of a recorded event. Now, almost everyone is getting a taste of it, thanks to the prolific spread of blogging, Facebook, Twitter and other mass publication vehicles. And--thanks God, as my Italian neighbors would say--some of them are beginning to realize that the sword has two edges. That perhaps not everything has to be, or even should be, transmitted instantly into the universal, public realm.
There are many things to applaud about a world where more people can have voices and communication among groups is easier. Twitter proved an invaluable tool for protesters in Iran, spreading word of public gatherings and police movements. And the mass proliferation of blogs reminds me, at least in some ways, of the dawning of the cable television era, when niche groups suddenly found programming targeted specifically to them.
There's also nothing inherently evil about any of the new communication methods or technologies. But most advances come with some kind of tradeoff, and anything used to excess begins to be problematic--as some people are beginning to find out with an all-shared, all-the-time lifestyle. One clear issue is preservation of privacy--especially for those who didn't volunteer to be part of a global chat room. But living a reported life 24 hours a day also has a cost, not only in terms of the type and quality of the interactions and conversations it allows, but also in terms of how present the reporters are in any given moment.
It's possible to both experience and report an event, but not instantaneously or simultaneously. When I had an assignment to fly a U-2 spy plane last fall, high enough to see the curvature of the earth, I got so preoccupied with taking photos and notes that I realized, part-way through the flight, that I wasn't actually experiencing any of it with any real depth. And to write anything of substance, I needed to first experience something of substance. So I turned off my intercom microphone, put the camera down, and just sat for a while. Looked out the window. Focused on what my senses were experiencing. Let my mind wander and my eyes drink in my surroundings. And in the richness of that silence, impressions softly bloomed. Of how fragile the world's atmosphere appeared. Of how being that high above the earth felt as if we were surreptitious invaders at the edge of a foreign realm ruled by powerful titans who needed no heat, air pressure or oxygen to survive. Of how lonely even a beautiful planet would be without anyone to welcome you home again.
And even those thoughts, put so cleanly into words here, took time to process and ferment, once I came back to earth.
New technology is all well and good. But the fact remains that it's tough to talk and listen at the same time, or be connected to an outside audience (even that of posterity) and still be fully immersed in the place, time, and dynamic of where you are. And that's a long-standing law of physics--or at least of human neuroscience and psychology--that I doubt any new technology is likely to overcome.
“Here is what I would like for you to know: In America, it is traditional to destroy the black body—it is heritage.”
Last Sunday the host of a popular news show asked me what it meant to lose my body. The host was broadcasting from Washington, D.C., and I was seated in a remote studio on the far west side of Manhattan. A satellite closed the miles between us, but no machinery could close the gap between her world and the world for which I had been summoned to speak. When the host asked me about my body, her face faded from the screen, and was replaced by a scroll of words, written by me earlier that week.
The host read these words for the audience, and when she finished she turned to the subject of my body, although she did not mention it specifically. But by now I am accustomed to intelligent people asking about the condition of my body without realizing the nature of their request. Specifically, the host wished to know why I felt that white America’s progress, or rather the progress of those Americans who believe that they are white, was built on looting and violence. Hearing this, I felt an old and indistinct sadness well up in me. The answer to this question is the record of the believers themselves. The answer is American history.
As the world frets over Greece, a separate crisis looms in China.
This summer has not been calm for the global economy. In Europe, a Greek referendum this Sunday may determine whether the country will remain in the eurozone. In North America, meanwhile, the governor of Puerto Rico claimed last week that the island would be unable to pay off its debts, raising unsettling questions about the health of American municipal bonds.
But the season’s biggest economic crisis may be occurring in Asia, where shares in China’s two major stock exchanges have nosedived in the past three weeks. Since June 12, the Shanghai stock exchange has lost 24 percent of its value, while the damage in the southern city of Shenzhen has been even greater at 30 percent. The tumble has already wiped out more than $2.4 trillion in wealth—a figure roughly 10 times the size of Greece’s economy.
Defining common cultural literacy for an increasingly diverse nation.
Is the culture war over?
That seems an absurd question. This is an age when Confederate monuments still stand; when white-privilege denialism is surging on social media; when legislators and educators in Arizona and Texas propose banning ethnic studies in public schools and assign textbooks euphemizing the slave trade; when fear of Hispanic and Asian immigrants remains strong enough to prevent immigration reform in Congress; when the simple assertion that #BlackLivesMatter cannot be accepted by all but is instead contested petulantly by many non-blacks as divisive, even discriminatory.
And that’s looking only at race. Add gender, guns, gays, and God to the mix and the culture war seems to be raging along quite nicely.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
A new book by the evolutionary biologist Jerry Coyne tackles arguments that the two institutions are compatible.
In May 1988, a 13-year-old girl named Ashley King was admitted to Phoenix Children’s Hospital by court order. She had a tumor on her leg—an osteogenic sarcoma—that, writes Jerry Coyne in his book Faith Versus Fact, was “larger than a basketball,” and was causing her leg to decay while her body started to shut down. Ashley’s Christian Scientist parents, however, refused to allow doctors permission to amputate, and instead moved their daughter to a Christian Science sanatorium, where, in accordance with the tenets of their faith, “there was no medical care, not even pain medication.” Ashley’s mother and father arranged a collective pray-in to help her recover—to no avail. Three weeks later, she died.
In 1992, the neuroscientist Richard Davidson got a challenge from the Dalai Lama. By that point, he’d spent his career asking why people respond to, in his words, “life’s slings and arrows” in different ways. Why are some people more resilient than others in the face of tragedy? And is resilience something you can gain through practice?
The Dalai Lama had a different question for Davidson when he visited the Tibetan Buddhist spiritual leader at his residence in Dharamsala, India. “He said: ‘You’ve been using the tools of modern neuroscience to study depression, and anxiety, and fear. Why can’t you use those same tools to study kindness and compassion?’ … I did not have a very good answer. I said it was hard.”
Former Senator Jim Webb is the fifth Democrat to enter the race—and by far the most conservative one.
In a different era’s Democratic Party, Jim Webb might be a serious contender for the presidential nomination. He’s a war hero and former Navy secretary, but he has been an outspoken opponent of recent military interventions. He’s a former senator from Virginia, a purple state. He has a strong populist streak, could appeal to working-class white voters, and might even have crossover appeal from his days as a member of the Reagan administration.
In today’s leftward-drifting Democratic Party, however, it’s hard to see Webb—who declared his candidacy Thursday—getting very far. As surprising as Bernie Sanders’s rise in the polls has been, he looks more like the Democratic base than Webb does. The Virginian is progressive on a few major issues, including the military and campaign spending, but he’s far to the center or even right on others: He's against affirmative action, supports gun rights, and is a defender of coal. During the George W. Bush administration, Democrats loved to have him as a foil to the White House. It’s hard to imagine the national electorate will cotton to him in the same way. Webb’s statement essentially saying he had no problem with the Confederate battle flag flying in places like the grounds of the South Carolina capitol may have been the final straw. (At 69, he’s also older than Hillary Clinton, whose age has been a topic of debate, though still younger than Bernie Sanders or Joe Biden.)
The Fourth of July—a time we Americans set aside to celebrate our independence and mark the war we waged to achieve it, along with the battles that followed. There was the War of 1812, the War of 1833, the First Ohio-Virginia War, the Three States' War, the First Black Insurrection, the Great War, the Second Black Insurrection, the Atlantic War, the Florida Intervention.
Confused? These are actually conflicts invented for the novel The Disunited States of America by Harry Turtledove, a prolific (and sometimes-pseudonymous) author of alternate histories with a Ph.D. in Byzantine history. The book is set in the 2090s in an alternate United States that is far from united. In fact, the states, having failed to ratify a constitution following the American Revolution, are separate countries that oscillate between cooperating and warring with one another, as in Europe.
Highlights from seven days of reading about entertainment
British Cinemas Need to Do Better for Black Audiences
Simran Hans | BuzzFeed
“The myth that black people don’t go to the cinema becomes a self-fulfilling prophecy, predicated on the assumption that cinemagoers are only interested in seeing themselves represented on screen. This seems to be at the heart of the problem.”
Hump Day: The Utterly OMG Magic Mike XXL
Wesley Morris | Grantland
“Not since the days of peak Travolta and Dirty Dancing has a film so perfectly nailed something essential about movie lust: Male vulnerability is hot, particularly when the man is dancing with and therefore for a woman. It aligns the entire audience with the complex prerogatives of female desire.”
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.