In every field, there are artists who rise above the rest of us. I look up and marvel at the beauty of the stars. My Uncle Ned, who's a Harvard/University of Michigan-trained PhD astronomer, sees and marvels at astrophysical processes that go far beyond my surface appreciation of a pretty night sky. The same is true of art, music, cooking, botany, architecture, and almost every human endeavor.
William Safire, who died Sunday at the age of 79, was an artist in the field of language. And his voice will be missed.
Artists and experts don't just know more about their subjects; they actually see them differently. Which has its pros and cons. Painters don't just see objects, they see a mix of light and shadow. Ever since becoming a pilot, I no longer see clouds as just pretty puffy things in the sky. I see high cirrus, which means I may need to depart earlier than planned, because a change in the weather is coming. A breezy day isn't just nice. It means turbulence in the pass.
The ability to appreciate far more layers of detail means that far more details can also irritate. My brother David can appreciate many more fine points of a symphony than I can. But he'll also be bothered by the fact that the horns in the second movement came in just a tad too late. And once you understand the technical elements of a subject, it can be hard to look at it without that magnifying lens. There's a line near the end of the movie Men in Black where the Tommy Lee Jones character, about to have his memories of aliens erased, says that it's going to be nice to be able to look up and see just a beautiful starry sky again.
As they delve further into the details of their art, artists also run the risk of getting lost in their own personal forest of specifics and language, leaving the rest of us too far behind to follow. Which is fine, as long as you don't care about communicating any of your ideas or the wonder of your discoveries to the rest of the world. I interviewed a NASA scientist once who insisted that to say the satellite he'd worked on had a near-equatorial orbit was untrue. It was, he said, a low-to-mid-inclination orbit. I explained that the book was for a lay audience, and most people didn't inherently know what a low-to-mid-inclination orbit was, unless we explained it further. "Well, any intelligent person knows!" he exclaimed.
The same possibility exists with language. There are purists who, I suspect, are writing more for their own enjoyment than the comprehension of the audience. They're in love with multi-syllabic words, even if only six people in the audience know or can envision what those words mean. Not that there's anything inherently wrong with that, any more than with a jazz artist who cares more about reaching the pinnacle of intense self-expression than commercial success. In fact, I think it's important to have some purists out there, if only to remind the rest of us that the world contains magnificent mountains beyond the familiar, local hills we see and use every day. It's just important to be clear about the goal, and be okay with the consequences of your choices.
William Safire was fascinated almost to the point of obsession with the details of words, leading to many arcane debates with his readers over seeming minutiae of nuanced word origin, usage, or meaning. Live by the sword, die by the sword. And there were undoubtedly times when his own love of little-known words kept readers away from the ideas he was expressing. But he also asked and explored thought-provoking questions--including, in this 2008 blog entry, whether perhaps Pliny the Younger was the world's first real blogger. And in a world of the instant-word-factory-assembly-line crunch of blogging and email, and the word-annihilation of texting and Twitter (LOL if u no wht i mean), the presence of those who still love, explore, and use the full depths and twists of the English language--or any language--becomes even more important if the art is not to die out.
I'm not a purist of language; I'm as concerned with getting the point across as I am with the beauty of the words I put together to do it. But I am still a practitioner of the art; a member of the symphony, if not its artiste solo perfectionist. And so I truly appreciate those whose passion, skill, and knowledge act as a beacon for the rest of us, pulling us further along than we otherwise would have gone.
Ammon Shea, a dedicated word-lover, wrote a book last year about the year he spent reading the Oxford English Dictionary, cover to cover. (Reading the OED: One Man, One Year, 21,730 Pages). His obsession for the task drove those around him nuts, and I can't say as I have the passion required to follow in his footsteps. But I loved his book, and all the discoveries he allowed me to share. To think! There's actually a word for a fear of dinner parties! Who knew?
I also don't have the stamina of a William Manchester, whose biography of Winston Churchill stretched over three volumes--the last of which had to be completed by someone else, because Manchester suffered a series of strokes that left him, in his last years, unable to write. In commenting on the tragedy of a man whose life's work was the loving caress of words having lost his ability to find them, essayist Roger Rosenblatt recited one of Manchester's passages about Churchill's funeral:
"When his flag-draped coffin moved slowly across the old capital, drawn by naval ratings, and bare-headed Londoners stood trembling in the cold, they mourned, not only him and all he had meant, but all they had been and no longer were, and would never be again."
Manchester, Rosenblatt noted, most likely "had only the scantiest idea where that sentence would end when he began it. Only when he caught up with it could he know. But then, there was another sentence running ahead of him. There was always another sentence. And now there isn't." I still look at Manchester's words ... and Rosenblatt's framing of them ... and feel as if I've been blessed with a combination of master performances so beautiful and perfect that if they'd been played out in a concert hall, I would have shouted aloud, "Bravo!"
Communication doesn't have to be taught. We learn it instinctively as small children. But the art of language--exploring words and crafting them together with rhythm, poetry, and meaning--is a learned and practiced skill that few ever master as well as Manchester, Keats, Shakespeare, or Safire. Like master chefs, musicians, athletes or scientists, they show us what's possible, and add a layer of nuanced beauty to a sometimes overly practical world.
I didn't always agree with Safire's detailed focus or opinions. And sometimes his immersion in his art may have stood in the way of getting his point across to a broader audience. But maybe, the point he really wanted to get across was simply how much richness there was in this language we use every day, if only we'd take the time to explore and savor the forest with a little more attention and depth. And on that point, his message was inimitably, powerfully, and exceptionally clear.
As Coldplay blandly strained for the universal, she and Bruno Mars pulled off something more specific and more daring.
What a perfect Beyoncé song name: “Formation.” All great pop involves people acting in formation. So does all great change. And while fans scream that Beyoncé’s a “queen” and “goddess,” her core appeal really is as a drill sergeant. With Beyoncé in command, greatness is scalable, achievable, for the collective. Everyone waves their hands to the same beat. Everyone walks around like they have hot sauce in their bag.
But in pop and in politics, “everyone” is a loaded term. Stars as ubiquitous as Beyoncé have haters, the “albino alligators” who “Formation” informs us she twirls upon. And in a more general historical sense, “everyone” can be a dangerous illusion that elevates one point of view as universal while minimizing others. Beyoncé gets all of this, it seems. As a pop star, she surely wants to have as broad a reach as possible. But as an artist, she has a specific message, born of a specific experience, meaningful to specific people. Rather than pretend otherwise, she’s going to make art about the tension implied by this dynamic. She’s going to show up to the Super Bowl with a phalanx of women dressed as Black Panthers.
Black poverty is fundamentally distinct from white poverty—and so cannot be addressed without grappling with racism.
There have been a number of useful entries in the weeks since Senator Bernie Sanders declared himself against reparations. Perhaps the most clarifying comes from Cedric Johnson in a piece entitled, “An Open Letter To Ta-Nehisi Coates And The Liberals Who Love Him.” Johnson’s essay offers those of us interested in the problem of white supremacy and the question of economic class the chance to tease out how, and where, these two problems intersect. In Johnson’s rendition, racism, in and of itself, holds limited explanatory power when looking at the socio-economic problems which beset African Americans. “We continue to reach for old modes of analysis in the face of a changed world,” writes Johnson. “One where blackness is still derogated but anti-black racism is not the principal determinant of material conditions and economic mobility for many African Americans.”
For decades the Man of Steel has failed to find his groove, thanks to a continual misunderstanding of his strengths.
Superman should be invincible. Since his car-smashing debut in 1938, he’s starred in at least one regular monthly comic, three blockbuster films, and four television shows. His crest is recognized across the globe, his supporting cast is legendary, and anybody even vaguely familiar with comics can recount the broad strokes of his origin. (The writer Grant Morrison accomplished it in eight words: “Doomed Planet. Desperate Scientists. Last Hope. Kindly Couple.”) He’s the first of the superheroes, a genre that’s grown into a modern mass-media juggernaut.
And yet, for a character who gains his power from the light of the sun, Superman is curiously eclipsed by other heroes. According to numbers provided by Diamond Distributors, the long-running Superman comic sold only 55,000 copies a month in 2015, down from around 70,000 in 2010—a mediocre showing even for the famously anemic comic-book market. That’s significantly less than his colleague Batman, who last year moved issues at a comparatively brisk 150,000 a month. Mass media hasn’t been much kinder: The longest-running Superman television show, 2001’s Smallville, kept him out of his iconic suit for a decade. Superman Returns recouped its budget at the box office, but proved mostly forgettable. 2013’s Man of Steel drew sharp criticism from critics and audiences alike for its bleak tone and rampaging finale. Trailers for the sequel, Batman v Superman: Dawn of Justice, have shifted the focus (and top billing) to the Dark Knight. Worst of all, conventional wisdom puts the blame on Superman himself. He’s boring, people say; he’s unrelatable, nothing like the Marvel characters dominating the sales charts and the box office. More than anything, he seems embarrassing. Look at him. Truth? Justice? He wears his underwear on the outside.
Will the Democratic Party nominate a candidate who hasn’t been a member of their party, and who has long denounced it?
When a party chooses its presidential candidate, it also chooses its party leader in the election. This year the Democrats face an unusual situation. Bernie Sanders isn’t just an outsider to the party establishment; he has never even been a member of the party, and has long excoriated it in unsparing language. Although the media haven’t much focused on this history, the early signs suggest it could become a problem for Sanders in getting the nomination—and a problem for the party if he does get it.
According to the entrance polls at the Iowa caucuses, there was a 30-percentage-point split between self-identified Democrats and independents in their support for Sanders. Hillary Clinton won 56 percent of self-identified Democrats but only 26 percent of independents, while Sanders won only 39 percent of Democrats but 69 percent of independents.
Immediately, the pings from fellow journalists (and media-adjacent folk) came pouring in, all saying something along the lines of, “Can you actually let me know what you find out? I’m addicted to that stuff.”
They mean “addicted” in the jokey, dark-chocolate-and-Netflix-streaming way, but the habit can border on pathological. For me, rock bottom was a recent, obscenely long workday during which an entire 12-pack of coconut La Croix somehow made it down my throat, can by shining can.
In Homs, Syria, where entire city blocks have been reduced to rubble by years of civil war, a Syrian wedding photographer thought of using the destruction of the city as a backdrop for pictures of newlywed couples “to show that life is stronger than death,” according to AFP photographer Joseph Eid. Here, Nada Merhi, 18, and her husband, Syrian army soldier Hassan Youssef, 27, pose for a series of wedding pictures amid heavily damaged buildings in Homs on February 5, 2016.
Humbled by his struggling presidential campaign, can the once-mighty New Jersey governor vault back into contention after Saturday’s debate?
SALEM, New Hampshire—Chris Christie was accustomed to being a big man: a man of stature, a man of power, a man who demands and gets his way.
But recently, the big man (this is a description of his personality, not his size) was seeming awfully small.
On Friday evening here, the governor of New Jersey was desperately trying to talk some sense into the people of New Hampshire, a couple hundred of whom had come out to see him on a snowy night. The night before, Christie’s rival Marco Rubio had played the same venue, filling a larger room of the elementary school beyond its capacity. Christie was begging the crowd not to pile on the bandwagon of the apparent winner, but instead, to show some courage.
The former president’s heated assault on Bernie Sanders is a reminder of how the Clintons have long reacted to any opposition.
One of my oldest Hillary Clinton memories: Twenty-six years ago, I stood in the second-floor rotunda of the Arkansas Capitol half-listening to a news conference by Tom McRae, an earnest Democrat challenging Governor Bill Clinton for re-election. Then I heard it: Click. Clack. Click. Clack. Click. Clack.
The sound of Hillary Clinton’s low-heeled shoes on a hidden marble hallway jarred McRae, who in 1990 was Bill Clinton’s biggest obstacle to a fifth term and a presidential bid two years later. The first lady of Arkansas rounded the corner and stormed his news conference. “Tom!” she shouted. “I think we oughta get the record straight!”
Waving a sheaf of papers, Hillary Clinton undercut McRae’s criticism of her husband’s record by pointing to McRae’s past praise of the governor. It was a brutal sandbagging. “Many of the reports you issued not only praise the governor on his environmental record,” she said, “but his education and his economic record!”
The Denver Broncos beat the Carolina Panthers, but neither Peyton Manning nor Cam Newton seemed able to prove his worth.
Now more than ever, the NFL is all about the quarterbacks. The buildup to Super Bowl 50 proved no exception: In the two weeks prior to Sunday night’s game in Santa Clara, the national conversation largely centered on the signal-callers, whose styles of play and off-field personas were pored over in every manner imaginable by an army of reporters and analysts. The game’s two possible outcomes were pre-cast as career-defining triumphs for the passers. If the Denver Broncos won, it would be a rousing sendoff for the potentially retiring all-time great Peyton Manning. If the Carolina Panthers won, it would be a coronation for Cam Newton, this season’s Most Valuable Player.
The Broncos beat the Panthers, 24-10, but the game featured none of the displays of virtuosity fans of Manning or Newton might have hoped for. It was a plodding, mistake-riddled affair, all stuffed runs and stalled drives. Maybe the most miraculous thing about the game was that it ended at all; it seemed for a time that it might simply give out somewhere along the way, leaving the Denver and Carolina players to wander around Levi’s Stadium until the resumption of football next fall.
One professor is borrowing a method from Harvard Business School to engage students and inspire better decision-making skills.
In a spacious classroom in Aldrich Hall on the Harvard Business School campus, 100 students are passionately discussing a case called “Battle Over a Bank.” But these aren’t MBA students deliberating over how much the government should regulate the financial sector. This group of mostly undergraduates, guided by the award-winning Harvard Business School professor David Moss, is diving into the fierce 1791 debate over whether the Constitution could be interpreted to allow the fledgling U.S. government the power to form a bank at all.
This class, “History of American Democracy,” is no pedestrian historical survey course. It uses the case method—the business school’s signature teaching technique—to immerse undergraduates (as well as a limited number of HBS students) in critical episodes in the development of American democracy.