In every field, there are artists who rise above the rest of us. I look up and marvel at the beauty of the stars. My Uncle Ned, who's a Harvard/University of Michigan-trained PhD astronomer, sees and marvels at astrophysical processes that go far beyond my surface appreciation of a pretty night sky. The same is true of art, music, cooking, botany, architecture, and almost every human endeavor.
William Safire, who died Sunday at the age of 79, was an artist in the field of language. And his voice will be missed.
Artists and experts don't just know more about their subjects; they actually see them differently. Which has its pros and cons. Painters don't just see objects; they see a mix of light and shadow. Ever since becoming a pilot, I no longer see clouds as just pretty puffy things in the sky. I see high cirrus, which means I may need to depart earlier than planned, because a change in the weather is coming. A breezy day isn't just nice. It means turbulence in the pass.
The ability to appreciate far more layers of detail means that far more details can also irritate. My brother David can appreciate many more fine points of a symphony than I can. But he'll also be bothered by the fact that the horns in the second movement came in just a tad too late. And once you understand the technical elements of a subject, it can be hard to look at it without that magnifying lens. There's a line near the end of the movie Men in Black where the Tommy Lee Jones character, about to have his memories of aliens erased, says that it's going to be nice to be able to look up and see just a beautiful starry sky again.
As they delve further into the details of their art, artists also run the risk of getting lost in their own personal forest of specifics and language, leaving the rest of us too far behind to follow. Which is fine, as long as you don't care about communicating any of your ideas or the wonder of your discoveries to the rest of the world. I interviewed a NASA scientist once who insisted that to say the satellite he'd worked on had a near-equatorial orbit was untrue. It was, he said, a low-to-mid-inclination orbit. I explained that the book was for a lay audience, and most people didn't inherently know what a low-to-mid-inclination orbit was, unless we explained it further. "Well, any intelligent person knows!" he exclaimed.
The same possibility exists with language. There are purists who, I suspect, are writing more for their own enjoyment than the comprehension of the audience. They're in love with multisyllabic words, even if only six people in the audience know or can envision what those words mean. Not that there's anything inherently wrong with that, any more than with a jazz artist who cares more about reaching the pinnacle of intense self-expression than commercial success. In fact, I think it's important to have some purists out there, if only to remind the rest of us that the world contains magnificent mountains beyond the familiar, local hills we see and use every day. It's just important to be clear about the goal, and be okay with the consequences of your choices.
William Safire was fascinated almost to the point of obsession with the details of words, leading to many arcane debates with his readers over the seeming minutiae of nuanced word origin, usage, or meaning. Live by the sword, die by the sword. And there were undoubtedly times when his own love of little-known words kept readers away from the ideas he was expressing. But he also asked and explored thought-provoking questions--including, in this 2008 blog entry, whether perhaps Pliny the Younger was the world's first real blogger. And in a world of the instant-word-factory-assembly-line crunch of blogging and email, and the word-annihilation of texting and Twitter (LOL if u no wht i mean), the presence of those who still love, explore, and use the full depths and twists of the English language--or any language--becomes even more important if the art is not to die out.
I'm not a purist of language; I'm as concerned with getting the point across as I am with the beauty of the words I put together to do it. But I am still a practitioner of the art; a member of the symphony, if not its solo artiste perfectionist. And so I truly appreciate those whose passion, skill, and knowledge act as a beacon for the rest of us, pulling us further along than we otherwise would have gone.
Ammon Shea, a dedicated word-lover, wrote a book last year about the year he spent reading the Oxford English Dictionary, cover to cover. (Reading the OED: One Man, One Year, 21,730 Pages). His obsession with the task drove those around him nuts, and I can't say I have the passion required to follow in his footsteps. But I loved his book, and all the discoveries he allowed me to share. To think! There's actually a word for a fear of dinner parties! Who knew?
I also don't have the stamina of a William Manchester, whose biography of Winston Churchill stretched over three volumes--the last of which had to be completed by someone else, because Manchester suffered a series of strokes that left him, in his last years, unable to write. In commenting on the tragedy of a man whose life's work was the loving caress of words having lost his ability to find them, essayist Roger Rosenblatt recited one of Manchester's passages about Churchill's funeral:
"When his flag-draped coffin moved slowly across the old capital, drawn by naval ratings, and bare-headed Londoners stood trembling in the cold, they mourned, not only him and all he had meant, but all they had been and no longer were, and would never be again."
Manchester, Rosenblatt noted, most likely "had only the scantiest idea where that sentence would end when he began it. Only when he caught up with it could he know. But then, there was another sentence running ahead of him. There was always another sentence. And now there isn't." I still look at Manchester's words ... and Rosenblatt's framing of them ... and feel as if I've been blessed with a combination of master performances so beautiful and perfect that if they'd been played out in a concert hall, I would have shouted aloud, "Bravo!"
Communication doesn't have to be taught. We learn it instinctively as small children. But the art of language, exploring words and crafting them together with rhythm, poetry, and meaning, is a learned and practiced skill that few ever master as well as Manchester, Keats, Shakespeare, or Safire. Like master chefs, musicians, athletes, or scientists, they show us what's possible, and add a layer of nuanced beauty to a sometimes overly practical world.
I didn't always agree with Safire's detailed focus or opinions. And sometimes his immersion in his art may have stood in the way of getting his point across to a broader audience. But maybe the point he really wanted to get across was simply how much richness there is in this language we use every day, if only we'd take the time to explore and savor the forest with a little more attention and depth. And on that point, his message was inimitably, powerfully, and exceptionally clear.
On both sides of the Atlantic—in the United Kingdom and the United States—political parties are realigning and voters’ allegiances are shifting.
When United Kingdom voters last week narrowly approved a referendum to leave the European Union, they underscored again how an era of unrelenting economic and demographic change is shifting the axis of politics across much of the industrialized world from class to culture.
Contrary to much initial speculation, the victory for the U.K. leave campaign didn't point toward victory in the U.S. presidential election for Donald Trump, who is voicing very similar arguments against globalization and immigration. The British results, in fact, underscored the obstacles facing his agenda of defensive nationalism in the vastly more diverse U.S. electorate.
But the Brexit referendum did crystallize deepening cultural fault lines in U.K. politics that are also likely to shape the contest between Trump and Hillary Clinton. In that way, the results prefigure both a continuing long-term realignment in the electoral base of each American party—and a possible near-term reshuffle of the tipping-point states in presidential politics.
They say religious discrimination against Christians is as big a problem as discrimination against other groups.
Many Christians believe they are subject to religious discrimination in the United States. A new report from the Public Religion Research Institute and Brookings offers evidence: Almost half of Americans say discrimination against Christians is as big a problem as discrimination against other groups, including blacks and other minorities. Three-quarters of Republicans and Trump supporters said this, and so did nearly eight out of 10 white evangelical Protestants. Of the latter group, six in 10 believe that although America once was a Christian nation, it is no longer—a huge jump from 2012.
Polling data can be split up in a million different ways. It’s possible to sort by ethnicity, age, political party, and more. The benefit of sorting by religion, though, is that it highlights people’s beliefs: the way their ideological and spiritual convictions shape their self-understanding. This survey suggests that race is not enough to explain the sense of loss some white Americans seem to feel about their country, although it’s part of the story; the same is true of age, education level, and political affiliation. People’s beliefs seem to have a distinctive bearing on how they view changes in American culture, politics, and law—and whether they feel threatened. No group is more likely to express this fear than conservative Christians.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
In an era fixated on science, technology, and data, the humanities are in decline. But they're more vital than ever.
Earlier this month, the Washington Post journalist Jeff Guo wrote a detailed account of how he’d managed to maximize the efficiency of his cultural consumption. “I have a habit that horrifies most people,” he wrote. “I watch television and films in fast forward … the time savings are enormous. Four episodes of Unbreakable Kimmy Schmidt fit into an hour. An entire season of Game of Thrones goes down on the bus ride from D.C. to New York.”
Guo’s method, which he admits has ruined his ability to watch TV and movies in real time, encapsulates how technology has allowed many people to accelerate the pace of their daily routines. But is faster always better when it comes to art? In a conversation at the Aspen Ideas Festival, co-sponsored by the Aspen Institute and The Atlantic, Drew Gilpin Faust, the president of Harvard University, and the cultural critic Leon Wieseltier agreed that true study and appreciation of the humanities is rooted in slowness—in the kind of deliberate education that can be accrued over a lifetime. While this can seem almost antithetical at times to the pace of modern life, and as subjects like art, philosophy, and literature face steep declines in enrollment at academic institutions in the U.S., both argued that studying the humanities is vital for the ways in which it teaches us how to be human.
As it’s moved beyond the George R.R. Martin novels, the series has evolved both for better and for worse.
Well, that was more like it. Sunday night’s Game of Thrones finale, “The Winds of Winter,” was the best episode of the season—the best, perhaps, in a few seasons. It was packed full of major developments—bye, bye, Baelor; hello, Dany’s fleet—but still found the time for some quieter moments, such as Tyrion’s touching acceptance of the role of Hand of the Queen. I was out of town last week and thus unable to take my usual seat at our Game of Thrones roundtable. But I did have some closing thoughts about what the episode—and season six in general—told us about how the show has evolved.
Last season, viewers got a limited taste—principally in the storylines in the North—of how the show would be different once showrunners Benioff and Weiss ran out of material from George R.R. Martin’s novels and had to set out on their own. But it was this season in which that exception truly became the norm. Though Martin long ago supplied Benioff and Weiss with a general narrative blueprint of the major arcs of the story, they can no longer rely on the books scene by scene. Game of Thrones is truly their show now. And thanks to changes in pacing, character development, and plot streamlining, it’s also a markedly different show from the one we watched in seasons one through four—for the worse and, to some degree, for the better.
American-Indian cooking has all the makings of a culinary trend, but it’s been limited by many diners’ unfamiliarity with its dishes and its loaded history.
DENVER—In 2010, the restaurateur Matt Chandra told The Atlantic that the Native American restaurant he and business partner Ben Jacobs had just opened would have 13 locations “in the near future.” But six years later, just one other outpost of their fast-casual restaurant, Tocabe, is up and running.
In the last decade, at least a handful of articles predicted that Native American food would soon see wider reach and recognition. “From the acclaimed Kai restaurant in Phoenix to Fernando and Marlene Divina's James Beard Award-winning cookbook, Foods of the Americas, to the White Earth Land Recovery Project, which sells traditional foods like wild rice and hominy, this long-overlooked cuisine is slowly gaining traction in the broader culinary landscape,” wrote Katie Robbins in her Atlantic piece. “[T]he indigenous food movement is rapidly gaining momentum in the restaurant world,” proclaimed Mic in the fall of 2014. This optimism sounds reasonable enough: The shift in the restaurant world toward more locally sourced ingredients and foraging dovetails nicely with the hallmarks of Native cuisine, which is often focused on using local crops or herds. Yet while there are a few Native American restaurants in the U.S. (there’s no exact count), the predicted rise hasn’t really happened, at least not to the point where most Americans are familiar with Native American foods or restaurants.
As incomes fall across the nation, even better-off areas like Sheboygan County, Wisconsin, are faltering.
SHEBOYGAN, Wis.—There is still a sizable middle class in this county of 115,000 on the shores of Lake Michigan, a pleasant hour’s drive from Milwaukee. You can see it in the cars that pour in and out of the parking lots of local factories, in the restaurants packed with older couples on weeknights, and in the bars that seem to be on every single corner. You can see it in the local parks, including one called Field of Dreams, where kids play soccer and baseball and their parents sit and watch.
About 63 percent of adults in Sheboygan make between $41,641 and $124,924, meaning the area has one of the highest shares of middle-class households in the country, according to a report from the Pew Research Center. Nationally, only 51 percent of adults are middle-class.
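The income bracket above is consistent with Pew's usual definition of middle income: households earning between two-thirds and double the median income, adjusted for the local cost of living and household size. A minimal sketch of that arithmetic (the median figure here is an assumption, back-derived from the article's lower bound rather than taken from the Pew report; the article's upper bound of $124,924 differs from the computed value by a dollar, presumably due to rounding in the underlying data):

```python
def middle_class_band(median_income):
    """Pew-style middle-income band: two-thirds to double the median income."""
    lower = round(median_income * 2 / 3)
    upper = round(median_income * 2)
    return lower, upper

# Back-deriving a hypothetical median from the article's lower bound of $41,641:
# if lower = (2/3) * median, then median = 41,641 * 3/2 = 62,461.50
median = 41641 * 3 / 2
lo, hi = middle_class_band(median)
print(lo, hi)  # 41641 124923
```

The same two-thirds-to-double band is what lets Pew compare "middle class" shares across regions with very different price levels.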
The Model S’s Autopilot isn’t technically a driverless feature, but the federal investigation into why a driver using it was killed will still influence the future of driverless vehicles.
Federal officials are investigating a crash that killed the driver of a Model S, a Tesla vehicle with a partially autonomous driving system, in a move that has major implications for the future of driverless vehicles.
“This is the first known fatality in just over 130 million miles where Autopilot was activated …” Tesla wrote in a statement on Thursday. “It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.”
The investigation may be standard procedure, but it’s also certain to influence the ongoing conversation about the safety of self-driving vehicles.
The Model S isn’t technically a driverless car, but Tesla has been a vocal player in the race to bring truly driverless cars to market. The company’s Autopilot feature is an assistive technology, meaning that drivers are instructed to keep their hands on the wheel while using it—even though it is sophisticated enough to complete tasks like merging onto the highway. It wasn’t clear from Tesla’s statement how engaged the driver was at the time of the crash.