In every field, there are artists who rise above the rest of us. I look up and marvel at the beauty of the stars. My Uncle Ned, who's a Harvard/University of Michigan-trained PhD astronomer, sees and marvels at astrophysical processes that go far beyond my surface appreciation of a pretty night sky. The same is true of art, music, cooking, botany, architecture, and almost every human endeavor.
William Safire, who died Sunday at the age of 79, was an artist in the field of language. And his voice will be missed.
Artists and experts don't just know more about their subjects; they actually see them differently. Which has its pros and cons. Painters don't just see objects; they see a mix of light and shadow. Ever since becoming a pilot, I no longer see clouds as just pretty puffy things in the sky. I see high cirrus, which means I may need to depart earlier than planned, because a change in the weather is coming. A breezy day isn't just nice. It means turbulence in the pass.
The ability to appreciate far more layers of detail means that far more details can also irritate. My brother David can appreciate many more fine points of a symphony than I can. But he'll also be bothered by the fact that the horns in the second movement came in just a tad too late. And once you understand the technical elements of a subject, it can be hard to look at it without that magnifying lens. There's a line near the end of the movie Men in Black where the Tommy Lee Jones character, about to have his memories of aliens erased, says that it's going to be nice to be able to look up and see just a beautiful starry sky again.
As they delve further into the details of their art, artists also run the risk of getting lost in their own personal forest of specifics and language, leaving the rest of us too far behind to follow. Which is fine, as long as you don't care about communicating any of your ideas or the wonder of your discoveries to the rest of the world. I interviewed a NASA scientist once who insisted that to say the satellite he'd worked on had a near-equatorial orbit was untrue. It was, he said, a low-to-mid-inclination orbit. I explained that the book was for a lay audience, and most people didn't inherently know what a low-to-mid-inclination orbit was, unless we explained it further. "Well, any intelligent person knows!" he exclaimed.
The same possibility exists with language. There are purists who, I suspect, are writing more for their own enjoyment than the comprehension of the audience. They're in love with multi-syllabic words, even if only six people in the audience know or can envision what those words mean. Not that there's anything inherently wrong with that, any more than with a jazz artist who cares more about reaching the pinnacle of intense self-expression than commercial success. In fact, I think it's important to have some purists out there, if only to remind the rest of us that the world contains magnificent mountains beyond the familiar, local hills we see and use every day. It's just important to be clear about the goal, and be okay with the consequences of your choices.
William Safire was fascinated almost to the point of obsession with the details of words, leading to many arcane debates with his readers over seeming minutiae of nuanced word origin, usage or meaning. Live by the sword, die by the sword. And there were undoubtedly times when his own love of little-known words kept readers away from the ideas he was expressing. But he also asked and explored thought-provoking questions--including, in this 2008 blog entry, whether perhaps Pliny the Younger was the world's first real blogger. And in a world of the instant-word-factory-assembly-line crunch of blogging and email, and the word-annihilation of texting and Twitter (LOL if u no wht i mean), the presence of those who still love, explore, and use the full depths and twists of the English language--or any language--becomes even more important if the art is not to die out.
I'm not a purist of language; I'm as concerned with getting the point across as I am with the beauty of the words I put together to do it. But I am still a practitioner of the art; a member of the symphony, if not its artiste solo perfectionist. And so I truly appreciate those whose passion, skill, and knowledge act as a beacon for the rest of us, pulling us further along than we otherwise would have gone.
Ammon Shea, a dedicated word-lover, wrote a book last year about the year he spent reading the Oxford English Dictionary, cover to cover. (Reading the OED: One Man, One Year, 21,730 Pages). His obsession with the task drove those around him nuts, and I can't say that I have the passion required to follow in his footsteps. But I loved his book, and all the discoveries he allowed me to share. To think! There's actually a word for a fear of dinner parties! Who knew?
I also don't have the stamina of a William Manchester, whose biography of Winston Churchill stretched over three volumes--the last of which had to be completed by someone else, because Manchester suffered a series of strokes that left him, in his last years, unable to write. In commenting on the tragedy of a man whose life's work was the loving caress of words having lost his ability to find them, essayist Roger Rosenblatt recited one of Manchester's passages about Churchill's funeral:
"When his flag-draped coffin moved slowly across the old capital, drawn by naval ratings, and bare-headed Londoners stood trembling in the cold, they mourned, not only him and all he had meant, but all they had been and no longer were, and would never be again."
Manchester, Rosenblatt noted, most likely "had only the scantiest idea where that sentence would end when he began it. Only when he caught up with it could he know. But then, there was another sentence running ahead of him. There was always another sentence. And now there isn't." I still look at Manchester's words ... and Rosenblatt's framing of them ... and feel as if I've been blessed with a combination of master performances so beautiful and perfect that if they'd been played out in a concert hall, I would have shouted aloud, "Bravo!"
Communication doesn't have to be taught. We learn it instinctively as small children. But the art of language, exploring words and crafting them together with rhythm, poetry, and meaning, is a learned and practiced skill that few ever master as well as Manchester, Keats, Shakespeare, or Safire. Like master chefs, musicians, athletes or scientists, they show us what's possible, and add a layer of nuanced beauty to a sometimes overly practical world.
I didn't always agree with Safire's detailed focus or opinions. And sometimes his immersion in his art may have stood in the way of getting his point across to a broader audience. But maybe, the point he really wanted to get across was simply how much richness there was in this language we use every day, if only we'd take the time to explore and savor the forest with a little more attention and depth. And on that point, his message was inimitably, powerfully, and exceptionally clear.
Orr: “Sometimes a thing happens. Splits your life. There’s a before and after. I got like five of them at this point.”
This was Frank offering a pep talk to the son of his murdered former henchman Stan in tonight’s episode. (More on this in a moment.) But it’s also a line that captures this season of True Detective so perfectly that it almost seems like a form of subliminal self-critique.
Remember when Ray got shot in episode two and appeared to be dead, but came back with a renewed sense of purpose and stopped drinking? No? That's okay. Neither does the show: It was essentially forgotten after the subsequent episode. Remember when half a dozen (or more) Vinci cops were killed in a bloody shootout along with dozen(s?) of civilians? No? Fine: True Detective's left that behind, too. Unless I missed it, there was not a single mention of this nationally historic bloodbath tonight.
Educators seldom have enough time to do their business. What’s that doing to the state of learning?
It’s common knowledge that teachers today are stressed; that they feel underappreciated, disrespected, and disillusioned. It’s no wonder they’re ditching the classroom at such high rates—to the point where states from Indiana to Arizona to Kansas are dealing with teacher shortages. Meanwhile, the number of American students who go into teaching is steadily dropping.
A recent survey conducted jointly by the American Federation of Teachers and the Badass Teachers Association asked educators about the quality of their work life, and it got some pretty harrowing feedback. Just 15 percent of the 30,000 respondents, for example, strongly agreed that they’re enthusiastic about the profession. Compare that to the roughly 90 percent who strongly agreed that they were enthusiastic about it when they started their career, and it’s clear that something has changed about schools that’s pushing them away. Roughly three in four respondents said they “often” feel stressed by their jobs.
How a radical epilepsy treatment in the early 20th century paved the way for modern-day understandings of perception, consciousness, and the self
In 1939, a group of 10 people between the ages of 10 and 43, all with epilepsy, traveled to the University of Rochester Medical Center, where they would become the first people to undergo a radical new surgery.
The patients were there because they all struggled with violent and uncontrollable seizures. The procedure they were about to have was untested on humans, but they were desperate—none of the standard drug therapies for seizures had worked.
Between February and May of 1939, their surgeon William Van Wagenen, Rochester’s chief of neurosurgery, opened up each patient’s skull and cut through the corpus callosum, the part of the brain that connects the left hemisphere to the right and is responsible for the transfer of information between them. It was a dramatic move: By slicing through the bundle of neurons connecting the two hemispheres, Van Wagenen was cutting the left half of the brain away from the right, halting all communication between the two.
A controversial treatment shows promise, especially for victims of trauma.
It’s straight out of a cartoon about hypnosis: A black-cloaked charlatan swings a pendulum in front of a patient, who dutifully watches and ping-pongs his eyes in turn. (This might be chased with the intonation, “You are getting sleeeeeepy...”)
Unlike most stereotypical images of mind alteration—“Psychiatric help, 5 cents” anyone?—this one is real. An obscure type of therapy known as EMDR, or Eye Movement Desensitization and Reprocessing, is gaining ground as a potential treatment for people who have experienced severe forms of trauma.
Here’s the idea: The person is told to focus on the troubling image or negative thought while simultaneously moving his or her eyes back and forth. To prompt this, the therapist might move his fingers from side to side, or he might tap or wave a wand. The patient is told to let her mind go blank and notice whatever sensations might come to mind. These steps are repeated throughout the session.
Has the Obama administration’s pursuit of new beginnings blinded it to enduring enmities?
“The president said many times he’s willing to step out of the rut of history.” In this way Ben Rhodes of the White House, who over the years has broken new ground in the grandiosity of presidential apologetics, described the courage of Barack Obama in concluding the Joint Comprehensive Plan of Action with the Islamic Republic of Iran, otherwise known as the Iran deal. Once again Rhodes has, perhaps inadvertently, exposed the president’s premises more clearly than the president likes to do. The rut of history: It is a phrase worth pondering. It expresses a deep scorn for the past, a zeal for newness and rupture, an arrogance about old struggles and old accomplishments, a hastiness with inherited precedents and circumstances, a superstition about the magical powers of the present. It expresses also a generational view of history, which, like the view of history in terms of decades and centuries, is one of the shallowest views of all.
Some experts say the normal effects of severe adversity may be misdiagnosed as ADHD.
Dr. Nicole Brown’s quest to understand her misbehaving pediatric patients began with a hunch.
Brown was completing her residency at Johns Hopkins Hospital in Baltimore, when she realized that many of her low-income patients had been diagnosed with attention deficit/hyperactivity disorder (ADHD).
These children lived in households and neighborhoods where violence and relentless stress prevailed. Their parents found them hard to manage and teachers described them as disruptive or inattentive. Brown knew these behaviors as classic symptoms of ADHD, a brain disorder characterized by impulsivity, hyperactivity, and an inability to focus.
When Brown looked closely, though, she saw something else: trauma. Hyper-vigilance and dissociation, for example, could be mistaken for inattention. Impulsivity might be brought on by a stress response in overdrive.
Companies that overvalue alpha-male behavior need to change—both to retain female talent and for the bottom line.
When it comes to gender equality in the workplace, the research on its economic benefits is clear: Equality can boost profits and enhance reputation. And then there’s also the fact that it’s more fair. But progress for women in the workplace has so far been inadequate: Women are woefully underrepresented in executive positions, the pay gap persists, and the motherhood penalty is very real.
Barbara Annis is the founder of the Gender Intelligence Group, a consultancy that works with executives at major firms (including Deloitte, American Express, BMO Financial Group, and eBay) to create strategies to transform their work cultures into ones that are friendly to both men and women.
I recently spoke with Annis about her work and the challenges to achieving gender parity. The following transcript of our conversation has been edited for clarity.
Anti-discrimination statutes are coming into conflict with laws designed to preserve freedom of conscience, especially in the private sector.
Last week, the Equal Employment Opportunity Commission dropped an astounding ruling: By a 3-2 vote, it concluded that “sexual orientation is inherently a ‘sex-based consideration,’ and an allegation of discrimination based on sexual orientation is necessarily an allegation of sex discrimination under Title VII.”
This is a big deal: The Commission’s recommendations shape rulings on federal employees’ workplace-discrimination claims, and its field offices deal with claims made by employees at private organizations, as well. But the ruling is also a reminder of how complicated—and unresolved—the post-Obergefell legal landscape is. The Supreme Court’s ruling in favor of same-sex marriage at the end of June has set the country up for two new waves of discrimination claims: those made by same-sex couples and LGBT workers, and those made by religious Americans who oppose same-sex marriage. The two may seem distinct or even opposed, but they’re actually intertwined: In certain cases, extending new rights to LGBT workers will necessarily lead to religious-freedom objections, and vice versa.
There's an eerie foreshadowing to some of the author's musings from 54 years ago.
Aldous Huxley—author of the classic Brave New World, little-known children's book wordsmith, staple of Carl Sagan's reading list—would have been 118 today. To celebrate his mind and his legacy, here is a rare 1958 conversation with Mike Wallace—the same masterful interviewer who also offered rare glimpses into the minds of Salvador Dalí and Ayn Rand—in which Huxley predicts the "fictional world of horror" depicted in Brave New World is just around the corner for humanity. He explains how overpopulation is among the greatest threats to our freedom, warns against the effects of advertising on children, and, more than half a century before Occupy Wall Street, outlines how global economic destabilization will incite widespread social unrest.