There's been a certain poignancy to Veterans Day, in recent times, as the very last keepers of exactly what November 11th means close their aging eyes and leave us. At last count this year, there were perhaps five veterans of WWI still living. Even adding those too young to have fought, there are still only a handful who remember the end of the war and all that era contained and meant.
The living memory of World War II is not quite so close to extinction, but it, too, is slipping away. The youngest WWII veterans, assuming an age of 17 at the end of the war, are now 81. There might be one or two who slipped in younger, but if the living memory of the war were a language, it would be classified as "moribund," meaning it has only a few elderly speakers left, according to UNESCO's "Atlas of the World's Languages in Danger of Disappearing."
We feel the ache and pressure, as time grows short, to preserve as much of the wisdom and as many of the memories of those veterans as we can, sensing that when the last of them leave us, we will be bereft of something important: a part of our heritage, story, and learning whose loss will leave us the poorer. There's even a Veterans History Project, organized by the Library of Congress, that's trying to collect as many veterans' stories as possible before time runs out.
Our parents' and great-great-grandparents' memories, after all, tell us not only of the world before our time, but of who we are and where we came from. They give us our pride, our shame, our sense of grounding and roots, and a sense of continuity that is a unique part of our personal narrative and identity. But what about the language those ancestors spoke? Is that an important part of the picture, as well? And does it need to be kept "alive" in the same sense that we want their stories remembered and retold?
It's a relevant question, because experts expect that 90 percent of the world's approximately 7,000 languages will become extinct within the next 100 years as cultures mesh and isolated tribes die out. And the answer may well depend on where you sit when you view the question.
Some in the linguistic community are responding to the accelerating pace of language loss by scrambling to create a language database similar to the Library of Congress's Veterans History Project. Fifty internationally renowned linguists are gathering at the University of Utah this week to take the first steps toward cataloguing some of the world's endangered, seriously endangered, or moribund languages before they become extinct. They hope that the databases they help to create (and help direct funding to support) will provide the equivalent of DNA material that can be used to reconstruct languages, with all their cultural clues and connections, even after the last person with a spoken knowledge of them dies.
"The wisdom of humanity is coded in language," says Lyle Campbell, director of the university's Center for American Indian Languages. "Once a language dies, the knowledge dies with it."
But not all linguists agree. In a recent World Affairs article, John McWhorter, a linguist and lecturer in the Department of English and Comparative Literature at Columbia University, asked: "Would it be inherently evil if there were not 6,000 spoken languages but one? We must consider the question in its pure, logical essence, apart from particular associations with English and its history."
McWhorter's lengthy argument asserts that while the death of a language is an artistic loss, our attachment to linguistic diversity itself is a bit perverse, since languages, he believes, arose simply as a function of the geographical dispersion of peoples. Language, in his view, is not inherently linked to culture. And as a matter of practicality in an increasingly global world, the use and existence of fewer languages is not only less work, in terms of learning and maintenance, but an actual advantage.
More than one aspiring national government, especially in its nascent stages, would have agreed with McWhorter on that last point, but not because language is separate from culture. On the contrary, efforts to stamp out regional languages and instill one unified national language are undertaken precisely because language is so inextricable from, and central to, culture. Regional or tribal languages are seen as a threat to national loyalty and identity; a national language, by the same token, doesn't just make trade and communication easier. It also helps build another, unified, "national" identity in their place.
Unfortunately, that strategy doesn't always work, or at least not without a cost. Pamela Serota Cote, whose doctoral research at the University of San Francisco focused on Breton language and identity, argues that treating language merely as a practical tool, or viewing it as an outside connoisseur, as McWhorter does, misses the central importance of language to personal narrative and identity.
"We understand things, events, ourselves and others through a process of interpretation, which occurs in language," she argues. "The diversity of our languages represents the richness of our expressiveness of Being. This is how language, culture and identity intersect; it is also why the loss of a language is such a concern and why minority language rights is such an emotionally charged issue in countries around the world. Because language discloses cultural and historical meaning, the loss of language is a loss of that link to the past. Without a link to the past, people in a culture lose a sense of place, purpose and path; one must know where one came from to know where one is going. The loss of language undermines a people's sense of identity and belonging, which uproots the entire community in the end. Yes, they may become incorporated into the dominant language and culture that has subsumed them, but they have lost their heritage along the way."
If the last living members of a community or culture who speak a particular dialect or language die, there are no descendants to be uprooted, of course. And, perhaps, there is nothing to be done about that. Serota Cote acknowledges that for a language to be revived, there has to be a population left to learn it, and a strong desire among the young people to revive that connection with their heritage.
But in Brittany, which was gathered into France only after the Revolution, the language became endangered not because of low population numbers, but because national edicts mandated that French be the only language spoken or learned. Finally, in the late 1970s, a movement sprang up to revive the Breton language, which bears far more resemblance to the tongue of Brittany's Celtic settlers than to French. Immersion schools now teach children Breton alongside French, and other cultural revival efforts in Breton music and dance have accompanied the language movement.
The result has been remarkable, even though only a tiny percentage of Bretons actually go through the language schools. The Bretons have not revolted against French rule. But the shame at being Breton has receded, much as the African-American "Roots" movement reduced the shame at being black by offering a narrative story and pride that the children of subsumed slaves had lacked. A high rate of alcoholism and depression has receded and, as Serota Cote observed, "every Breton I spoke with who has learned the language as an adult said they feel now that they have been able to close the gap and heal those past wounds of shame. Many described finally discovering their roots by learning the language. One Breton said that the language 'completes the whole.'"
The challenge of melding and balancing past and present, tribal roots and unified national identity, is one that many nations struggle with. Too much tribal loyalty can breed division, but too much focus on a unified whole can destroy not only the colors in a country's cultural fabric, but an important sense of identity and narrative continuity among its diverse citizens. And language, like family or cultural memories, can play an important role in that narrative.
Sometimes a language dies because an entire population dies out. That's still a loss, just as every plant and animal that becomes extinct is a loss to the richness of the planet's tapestry of existence. But in cases where a language wanes not because of physical extinction but because of cultural subsumption, its loss is a far more personal tragedy, at least to those within that culture. For someone inside a lost or dying culture, a language can be like the memories of our grandparents: not required, or even convenient, for efficient operation in a modern, globalized world, but essential for our sense of roots, security, identity, pride, continuity and wholeness.
Life moves on. World War I is a distant memory, even for the elderly. Many Americans don't even know the real origins of "Veterans Day." But imagine, for a moment, if we'd lost more than just the memory of the day's origins. Imagine if, along with losing those who remembered the world when Armistice Day was first celebrated, or even what the experience of WWII meant, we were also losing the language through which those memories had been lived and recorded. Chances are that any arguments about the accidental origins of that language, or its obscure use in the commercial world, would suddenly seem far less important to us than keeping that link with our heritage and past alive. No matter what anyone on the outside thought.
A case study in how a story about substance turned into just another incident of White House infighting
“DO NOT CONGRATULATE.”
That was the instruction that President Donald Trump received on briefing materials before he called Russian President Vladimir Putin on Tuesday to discuss Putin’s victory in a reelection widely regarded as corrupt.
But Trump did congratulate Putin, and he also declined to bring up the recent poisoning of an ex-Russian spy and his daughter in London, a crime that the British government blames on the Kremlin. As I wrote on Tuesday, Trump’s reaction was somewhat outside the mainstream of American responses when autocratic rulers win elections, but not entirely apart from it. Barack Obama called Putin following his 2012 election victory, but waited several days before doing so, while the U.S. government criticized election irregularities.
In his new book, Steven Pinker is curiously blind to the power and benefits of small-town values.
I’m a scientist at UC Berkeley—a card-carrying true believer in liberal Enlightenment values. Imagine that I meet a bright young woman in a small town in Wisconsin or Alabama, and that I want to persuade her to become a scientist like me. “Listen, science is really great!” I say. “We scientists care about truth and reason and human flourishing. We include people from every country and culture. And our values have transformed the world. For thousands of years before the Enlightenment, the speed limit was the pace of a fast horse, and children died all the time. Now ideas move at the speed of light, and a child’s death is an unthinkable tragedy. Democracy has eclipsed tyranny, prosperity has outpaced poverty, medicine has routed illness, individual liberation has uprooted social convention. Come join us!”
Mark Zuckerberg might believe the world is better without privacy. He’s wrong.
It will be fantastically satisfying to see the boy genius flayed. All the politicians—ironically, in search of a viral moment—will lash Mark Zuckerberg from across the hearing room. They will corner Facebook’s founding bro, seeking to pin all manner of sin on him. This will make for scrumptious spectacle, but spectacle is a vacuous substitute for policy.
As Facebook’s scandals have unfolded, the backlash against Big Tech has accelerated at a dizzying pace. Anger, however, has outpaced thinking. The most fully drawn and enthusiastically backed proposal now circulating through Congress would regulate political ads that can appear on the platform, a law that hardly curbs the company’s power or profits. And, it should be said, a law that does nothing to attack the core of the problem: the absence of governmental protections for personal data.
They’re both blamed for predisposing their members to violent acts, but they’ve sparked radically different public-policy responses.
When I thought about locking up with a crew in 1996, I wanted to see a full initiation first, not parts I stumbled upon over the years. My friend Cliff and I arrived at a park not far from my home in Jamaica, Queens. Leaves danced with the wind around our feet, wafting an eerie feeling in my 14-year-old black body. The grounds of the initiation beckoned: a high-rise chain-link fence, enclosing two basketball courts.
Through the daylighted chain, I watched scowls and punches and stomps engulf the uninitiated teen—a stoppage, then an awkward transition into hugs, handshakes, and smiles. The striking contrast shot at my core of authenticity, the insincerity of the punch-hug, of the stomp-smile, murdering my thoughts of joining a crew.
I asked the guy who wrote the textbook about them.
Wednesday is the first full day of spring in the Northern Hemisphere, but you wouldn’t know it on the U.S. East Coast. A huge, ponderous snowstorm is lurching its way up the Atlantic seaboard, dumping snow from D.C. to Boston.
More than two inches per hour are falling in some places. At Washington’s Reagan National Airport, it hasn’t snowed this much, this late in the season, in more than 50 years. And the storm will not be a passing event: The entire system will meander up the Northeast, snowing all the while, for almost two days straight.
The lonely poverty of America’s white working class
For the last several months, social scientists have been debating the striking findings of a study by the economists Anne Case and Angus Deaton. Between 1998 and 2013, Case and Deaton argue, white Americans across multiple age groups experienced large spikes in suicide and fatalities related to alcohol and drug abuse—spikes that were so large that, for whites aged 45 to 54, they overwhelmed the dependable modern trend of steadily improving life expectancy. While critics have challenged the magnitude and timing of the rise in middle-age deaths (particularly for men), they and the study’s authors alike seem to agree on some basic points: Problems of mental health and addiction have taken a terrible toll on whites in America—though seemingly not in other wealthy nations—and the least educated among them have fared the worst.
How evangelicals, once culturally confident, became an anxious minority seeking political protection from the least traditionally religious president in living memory
One of the most extraordinary things about our current politics—really, one of the most extraordinary developments of recent political history—is the loyal adherence of religious conservatives to Donald Trump. The president won four-fifths of the votes of white evangelical Christians. This was a higher level of support than either Ronald Reagan or George W. Bush, an outspoken evangelical himself, ever received.
Trump’s background and beliefs could hardly be more incompatible with traditional Christian models of life and leadership. Trump’s past political stances (he once supported the right to partial-birth abortion), his character (he has bragged about sexually assaulting women), and even his language (he introduced the words pussy and shithole into presidential discourse) would more naturally lead religious conservatives toward exorcism than alliance. This is a man who has cruelly publicized his infidelities, made disturbing sexual comments about his elder daughter, and boasted about the size of his penis on the debate stage. His lawyer reportedly arranged a $130,000 payment to a porn star to dissuade her from disclosing an alleged affair. Yet religious conservatives who once blanched at PG-13 public standards now yawn at such NC-17 maneuvers. We are a long way from The Book of Virtues.
Photos of the long, painstaking construction process of the $8 billion James Webb Space Telescope, set to launch in early 2019.
Assembling the world’s most powerful space telescope is a complicated process, and Chris Gunn has been there from nearly the beginning. Gunn, a NASA photographer, has spent almost a decade photographing the James Webb Space Telescope, the successor to the famed Hubble, capturing its transformation from a bare metal framework into a gleaming science observatory with 18 gold-plated mirrors. “For me, a science-fiction buff, it’s almost like seeing the Enterprise being built,” Gunn says. NASA has Gunn capture nearly every step in the process for the space agency’s own records—“every single wrench turn, every single movement is documented,” he says. Some photos are never released because they feature proprietary technology. Others, after thorough approval from project managers, are released to the public to spark interest and awe at the ambitious (and expensive) project. Soon, it’ll be Webb’s turn to take pictures. In 2019, the telescope will launch to a spot about one million miles from Earth and settle into an orbit around the sun. Webb, seeing the cosmos in infrared wavelengths, will photograph the most distant stars and galaxies in the universe. When that happens, Gunn says, “I really want people to want to know what the observatory looked like and how it was built and about the people who built it.”
A wedding is no longer the first step into adulthood that it once was, but, often, the last.
The decline of marriage is upon us. Or, at least, that’s what the zeitgeist would have us believe. In 2010, when Time magazine and the Pew Research Center famously asked Americans whether they thought marriage was becoming obsolete, 39 percent said yes. That was up from 28 percent when Time asked the question in 1978. Also, since 2010, the Census Bureau has reported that married couples have made up less than half of all households; in 1950 they made up 78 percent. Data such as these have led to much collective handwringing about the fate of the embattled institution.
But there is one statistical tidbit that flies in the face of this conventional wisdom: A clear majority of same-sex couples who are living together are now married. Same-sex marriage was illegal in every state until Massachusetts legalized it in 2004, and it did not become legal nationwide until the Supreme Court decision Obergefell v. Hodges in 2015. Two years after that decision, 61 percent of same-sex couples who were sharing a household were married, according to a set of surveys by Gallup. That’s a high take-up rate: Just because same-sex couples are able to marry doesn’t mean that they have to; and yet large numbers have seized the opportunity. (That’s compared with 89 percent of different-sex couples.)