In every field, there are artists who rise above the rest of us. I look up and marvel at the beauty of the stars. My Uncle Ned, who's a Harvard/University of Michigan-trained PhD astronomer, sees and marvels at astrophysical processes that go far beyond my surface appreciation of a pretty night sky. The same is true of art, music, cooking, botany, architecture, and almost every human endeavor.
William Safire, who died Sunday at the age of 79, was an artist in the field of language. And his voice will be missed.
Artists and experts don't just know more about their subjects; they actually see them differently. Which has its pros and cons. Painters don't just see objects; they see a mix of light and shadow. Ever since becoming a pilot, I no longer see clouds as just pretty puffy things in the sky. I see high cirrus, which means I may need to depart earlier than planned, because a change in the weather is coming. A breezy day isn't just nice. It means turbulence in the pass.
The ability to appreciate far more layers of detail means that far more details can also irritate. My brother David can appreciate many more fine points of a symphony than I can. But he'll also be bothered by the fact that the horns in the second movement came in just a tad too late. And once you understand the technical elements of a subject, it can be hard to look at it without that magnifying lens. There's a line near the end of the movie Men in Black where the Tommy Lee Jones character, about to have his memories of aliens erased, says that it's going to be nice to be able to look up and see just a beautiful starry sky again.
As they delve further into the details of their art, artists also run the risk of getting lost in their own personal forest of specifics and language, leaving the rest of us too far behind to follow. Which is fine, as long as you don't care about communicating any of your ideas or the wonder of your discoveries to the rest of the world. I interviewed a NASA scientist once who insisted that to say the satellite he'd worked on had a near-equatorial orbit was untrue. It was, he said, a low-to-mid-inclination orbit. I explained that the book was for a lay audience, and most people didn't inherently know what a low-to-mid-inclination orbit was, unless we explained it further. "Well, any intelligent person knows!" he exclaimed.
The same possibility exists with language. There are purists who, I suspect, are writing more for their own enjoyment than the comprehension of the audience. They're in love with multi-syllabic words, even if only six people in the audience know or can envision what those words mean. Not that there's anything inherently wrong with that, any more than with a jazz artist who cares more about reaching the pinnacle of intense self-expression than commercial success. In fact, I think it's important to have some purists out there, if only to remind the rest of us that the world contains magnificent mountains beyond the familiar, local hills we see and use every day. It's just important to be clear about the goal, and be okay with the consequences of your choices.
William Safire was fascinated almost to the point of obsession with the details of words, leading to many arcane debates with his readers over seeming minutiae of nuanced word origin, usage, or meaning. Live by the sword, die by the sword. And there were undoubtedly times when his own love of little-known words kept readers away from the ideas he was expressing. But he also asked and explored thought-provoking questions--including, in this 2008 blog entry, whether perhaps Pliny the Younger was the world's first real blogger. And in a world dominated by the instant-word-factory-assembly-line crunch of blogging and email, and the word-annihilation of texting and Twitter (LOL if u no wht i mean), the presence of those who still love, explore, and use the full depths and twists of the English language--or any language--becomes even more important if the art is not to die out.
I'm not a purist of language; I'm as concerned with getting the point across as I am with the beauty of the words I put together to do it. But I am still a practitioner of the art; a member of the symphony, if not its solo artiste perfectionist. And so I truly appreciate those whose passion, skill, and knowledge act as a beacon for the rest of us, pulling us further along than we otherwise would have gone.
Ammon Shea, a dedicated word-lover, wrote a book last year about the year he spent reading the Oxford English Dictionary, cover to cover. (Reading the OED: One Man, One Year, 21,730 Pages). His obsession with the task drove those around him nuts, and I can't say I have the passion required to follow in his footsteps. But I loved his book, and all the discoveries he allowed me to share. To think! There's actually a word for a fear of dinner parties! Who knew?
I also don't have the stamina of a William Manchester, whose biography of Winston Churchill stretched over three volumes--the last of which had to be completed by someone else, because Manchester suffered a series of strokes that left him, in his last years, unable to write. In commenting on the tragedy of a man whose life's work was the loving caress of words having lost his ability to find them, essayist Roger Rosenblatt recited one of Manchester's passages about Churchill's funeral:
"When his flag-draped coffin moved slowly across the old capital, drawn by naval ratings, and bare-headed Londoners stood trembling in the cold, they mourned, not only him and all he had meant, but all they had been and no longer were, and would never be again."
Manchester, Rosenblatt noted, most likely "had only the scantiest idea where that sentence would end when he began it. Only when he caught up with it could he know. But then, there was another sentence running ahead of him. There was always another sentence. And now there isn't." I still look at Manchester's words ... and Rosenblatt's framing of them ... and feel as if I've been blessed with a combination of master performances so beautiful and perfect that if they'd been played out in a concert hall, I would have shouted aloud, "Bravo!"
Communication doesn't have to be taught. We learn it instinctively as small children. But the art of language, exploring words and crafting them together with rhythm, poetry, and meaning, is a learned and practiced skill that few ever master as well as Manchester, Keats, Shakespeare, or Safire. Like master chefs, musicians, athletes, or scientists, they show us what's possible, and add a layer of nuanced beauty to a sometimes overly practical world.
I didn't always agree with Safire's detailed focus or opinions. And sometimes his immersion in his art may have stood in the way of getting his point across to a broader audience. But maybe the point he really wanted to get across was simply how much richness there is in this language we use every day, if only we'd take the time to explore and savor the forest with a little more attention and depth. And on that point, his message was inimitably, powerfully, and exceptionally clear.
Defining common cultural literacy for an increasingly diverse nation.
Is the culture war over?
That seems an absurd question. This is an age when Confederate monuments still stand; when white-privilege denialism is surging on social media; when legislators and educators in Arizona and Texas propose banning ethnic studies in public schools and assign textbooks euphemizing the slave trade; when fear of Hispanic and Asian immigrants remains strong enough to prevent immigration reform in Congress; when the simple assertion that #BlackLivesMatter cannot be accepted by all but is instead contested petulantly by many non-blacks as divisive, even discriminatory.
And that’s looking only at race. Add gender, guns, gays, and God to the mix and the culture war seems to be raging along quite nicely.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
A new book by the evolutionary biologist Jerry Coyne tackles arguments that the two institutions are compatible.
In May 1988, a 13-year-old girl named Ashley King was admitted to Phoenix Children’s Hospital by court order. She had a tumor on her leg—an osteogenic sarcoma—that, writes Jerry Coyne in his book Faith Versus Fact, was “larger than a basketball,” and was causing her leg to decay while her body started to shut down. Ashley’s Christian Scientist parents, however, refused to allow doctors permission to amputate, and instead moved their daughter to a Christian Science sanatorium, where, in accordance with the tenets of their faith, “there was no medical care, not even pain medication.” Ashley’s mother and father arranged a collective pray-in to help her recover—to no avail. Three weeks later, she died.
In 1992, the neuroscientist Richard Davidson got a challenge from the Dalai Lama. By that point, he’d spent his career asking why people respond to, in his words, “life’s slings and arrows” in different ways. Why are some people more resilient than others in the face of tragedy? And is resilience something you can gain through practice?
The Dalai Lama had a different question for Davidson when he visited the Tibetan Buddhist spiritual leader at his residence in Dharamsala, India. “He said: ‘You’ve been using the tools of modern neuroscience to study depression, and anxiety, and fear. Why can’t you use those same tools to study kindness and compassion?’ … I did not have a very good answer. I said it was hard.”
The Fourth of July—a time we Americans set aside to celebrate our independence and mark the war we waged to achieve it, along with the battles that followed. There was the War of 1812, the War of 1833, the First Ohio-Virginia War, the Three States' War, the First Black Insurrection, the Great War, the Second Black Insurrection, the Atlantic War, the Florida Intervention.
Confused? These are actually conflicts invented for the novel The Disunited States of America by Harry Turtledove, a prolific (and sometimes-pseudonymous) author of alternate histories with a Ph.D. in Byzantine history. The book is set in the 2090s in an alternate United States that is far from united. In fact, the states, having failed to ratify a constitution following the American Revolution, are separate countries that oscillate between cooperating and warring with one another, as in Europe.
As the world frets over Greece, a separate crisis looms in China.
This summer has not been calm for the global economy. In Europe, a Greek referendum this Sunday may determine whether the country will remain in the eurozone. In North America, meanwhile, the governor of Puerto Rico claimed last week that the island would be unable to pay off its debts, raising unsettling questions about the health of American municipal bonds.
But the season’s biggest economic crisis may be occurring in Asia, where shares in China’s two major stock exchanges have nosedived in the past three weeks. Since June 12, the Shanghai stock exchange has lost 24 percent of its value, while the damage in the southern city of Shenzhen has been even greater at 30 percent. The tumble has already wiped out more than $2.4 trillion in wealth—a figure roughly 10 times the size of Greece’s economy.
Former Senator Jim Webb is the fifth Democrat to enter the race—and by far the most conservative one.
In a different era’s Democratic Party, Jim Webb might be a serious contender for the presidential nomination. He’s a war hero and former Navy secretary, but he has been an outspoken opponent of recent military interventions. He’s a former senator from Virginia, a purple state. He has a strong populist streak, could appeal to working-class white voters, and might even have crossover appeal from his days as a member of the Reagan administration.
In today’s leftward-drifting Democratic Party, however, it’s hard to see Webb—who declared his candidacy Thursday—getting very far. As surprising as Bernie Sanders’s rise in the polls has been, he looks more like the Democratic base than Webb does. The Virginian is progressive on a few major issues, including the military and campaign spending, but he’s far to the center or even right on others: He's against affirmative action, supports gun rights, and is a defender of coal. During the George W. Bush administration, Democrats loved to have him as a foil to the White House. It’s hard to imagine the national electorate will cotton to him in the same way. Webb’s statement essentially saying he had no problem with the Confederate battle flag flying in places like the grounds of the South Carolina capitol may have been the final straw. (At 69, he’s also older than Hillary Clinton, whose age has been a topic of debate, though still younger than Bernie Sanders or Joe Biden.)
People labeled “smart” at a young age don’t deal well with being wrong. Life grows stagnant.
At whatever age smart people develop the idea that they are smart, they also tend to develop vulnerability around relinquishing that label. So the difference between telling a kid “You did a great job” and “You are smart” isn’t subtle. That is, at least, according to one growing movement in education and parenting that advocates for retirement of “the S word.”
The idea is that when we praise kids for being smart, those kids think: Oh good, I'm smart. And then later, when those kids mess up, which they will, they think: Oh no, I'm not smart after all. People will think I’m not smart after all. And that’s the worst. That’s a risk to avoid, they learn. “Smart” kids stand to become especially averse to making mistakes, which are critical to learning and succeeding.
The executive producer of Masterpiece says Jane Austen works a lot better on screen than Hemingway does.
For 44 years, PBS’s Masterpiece franchise has brought high-end British TV programs to American audiences. While the ultra-successful Downton Abbey comes from an original screenplay, many of Masterpiece’s shows over the years have been adapted from great works of literature. And the vast majority of those great works of literature, unsurprisingly, have been British.
But every so often, an American novel—like James Agee’s A Death in the Family or Willa Cather’s The Song of the Lark—has been turned into a Masterpiece. On Friday at the Aspen Ideas Festival, Rebecca Eaton, the longtime executive producer of Masterpiece, said she wished that the program had tackled more U.S. authors over the years. “The reasons that we haven't are twofold,” she said. “One is money, the second is money. And the third is money. Also, the dark nature of American literature, which is something to think about for a moment."
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.