There's been a certain poignancy to Veterans Day, in recent times, as the very last keepers of exactly what November 11th means close their aging eyes and leave us. At last count this year, there were perhaps five veterans of WWI still living. Add those who were too young to fight, and there are still only a handful who remember the end of the war and all that era contained and meant.
The living memory of World War II is not quite so close to extinction, but it, too, is slipping away. The youngest WWII veterans (assuming an age of 17 at the end of the war) are now 81. There might be one or two who slipped in younger, but if the living memory of the war were a language, it would be classified as "moribund," meaning it had only a few elderly speakers left, according to the UNESCO "Atlas of the World's Languages in Danger of Disappearing."
We feel the ache and pressure, as time grows short, to preserve as much of the wisdom and as many of the memories from those veterans as we can, sensing that when the last of them leave us, we will be bereft of something important: a part of our heritage, story and learning whose loss will leave us the poorer. There's even a Veterans History Project, organized by the Library of Congress, that's trying to collect as many veterans' stories as possible before time runs out.
Our parents' and great-great-grandparents' memories, after all, tell us not only of the world before our time, but of who we are and where we came from. They give us our pride, our shame, our sense of grounding and roots, and a sense of continuity that is a unique part of our personal narrative and identity. But what about the language those ancestors spoke? Is that an important part of the picture, as well? And does it need to be kept "alive" in the same sense that we want their stories remembered and retold?
It's a relevant question, because experts expect 90% of the world's approximately 7,000 languages will become extinct in the next 100 years as cultures mesh and isolated tribes die out. And the answer may well depend on where you sit when you view the question.
Some in the linguistic community are responding to the accelerating pace of language loss by scrambling to create a language database similar to the Library of Congress's Veterans History Project. Fifty internationally renowned linguists are gathering at the University of Utah this week to take the first steps toward cataloguing some of the world's endangered, seriously endangered, or moribund languages before they become extinct. They hope that the databases they help to create (and help direct funding to support) will provide the equivalent of DNA material that can be used to reconstruct languages, with all their cultural clues and connections, even after the last person with a spoken knowledge of them dies.
"The wisdom of humanity is coded in language," says Lyle Campbell, director of the university's Center for American Indian Languages. "Once a language dies, the knowledge dies with it."
But not all linguists agree. In a recent World Affairs article, John McWhorter, a linguist and lecturer in the Department of English and Comparative Literature at Columbia University, asked, "Would it be inherently evil if there were not 6,000 spoken languages but one? We must consider the question in its pure, logical essence, apart from particular associations with English and its history."
McWhorter's argument, which runs long, is that while the death of a language is an artistic loss, our attachment to linguistic diversity itself is a bit perverse, since in his view the world's many languages arose largely as a function of the geographical dispersion of peoples. Language, he believes, is not inherently linked to culture. And as a matter of practicality in an increasingly global world, the use and existence of fewer languages is not only less work, in terms of learning and maintenance, but actually an advantage.
More than one aspiring national government, especially in its nascent stages, would have agreed with McWhorter on that last point. But not because language is separate from culture. On the contrary, efforts to stamp out regional languages and impose a single, unified national language are undertaken precisely because language is so inextricably central to culture. Regional or tribal languages are seen as a threat to national loyalty and identity, while a national language doesn't just make trade and communication easier; it also helps build another, unified, "national" identity in their place.
Unfortunately, that strategy doesn't always work. Or, at least, not without a cost. Pamela Serota Cote, whose doctoral research at the University of San Francisco focused on Breton language and identity, argues that treating language merely as a practical tool, or appraising it from outside as a connoisseur, as McWhorter does, misses the central importance of language to personal narrative and identity.
"We understand things, events, ourselves and others through a process of interpretation, which occurs in language," she argues. "The diversity of our languages represents the richness of our expressiveness of Being. This is how language, culture and identity intersect; it is also why the loss of a language is such a concern and why minority language rights is such an emotionally charged issue in countries around the world. Because language discloses cultural and historical meaning, the loss of language is a loss of that link to the past. Without a link to the past, people in a culture lose a sense of place, purpose and path; one must know where one came from to know where one is going. The loss of language undermines a people's sense of identity and belonging, which uproots the entire community in the end. Yes, they may become incorporated into the dominant language and culture that has subsumed them, but they have lost their heritage along the way."
If the last living members of a community or culture who speak a particular dialect or language die, there are no descendants to be uprooted, of course. And, perhaps, there is nothing to be done about that. Serota Cote acknowledges that, for a language to be revived, there has to be a population left to learn it, and a strong desire among its young people to reconnect with their heritage.
But in Brittany, which was gathered into France only after the Revolution, the language became endangered not because of low population numbers, but because national edicts mandated that French be the only language spoken or learned. Finally, in the late 1970s, a movement sprang up to revive the Breton language, which bears far more resemblance to the tongue of Brittany's Celtic settlers than to French. Language-immersion schools now teach children Breton alongside French, and other cultural-revival efforts in Breton music and dance have accompanied the language movement.
The result has been remarkable, even though only a tiny percentage of Bretons actually go through the language schools. The Bretons have not revolted against French rule. But the shame at being Breton has receded, much as the African-American "Roots" movement reduced the shame at being black by offering a narrative and a pride that the descendants of slaves had lacked. High rates of alcoholism and depression have receded and, as Serota Cote observed, "every Breton I spoke with who has learned the language as an adult said they feel now that they have been able to close the gap and heal those past wounds of shame. Many described finally discovering their roots by learning the language. One Breton said that the language 'completes the whole.'"
The challenge of melding and balancing past and present, tribal roots and unified national identity, is one many nations struggle with. Too much tribal loyalty can breed division, but too much focus on a unified whole can destroy not only colors in the cultural fabric of a country, but also an important sense of identity and narrative continuity among its diverse citizens. And language, like family or cultural memories, can play an important role in that narrative.
Sometimes language dies because an entire population dies out. That's still a loss, just as every plant and animal that becomes extinct is a loss to the richness of the planet's tapestry of existence. But in cases where the language wanes not because of physical extinction, but because of cultural subsumption, the loss of a language is a far more personal tragedy ... at least to those within that culture. For someone inside a lost or dying culture, a language can be like the memories of our grandparents: not required, or even convenient, for efficiency of operation in a modern, globalized world, but essential for our sense of roots, security, identity, pride, continuity and wholeness.
Life moves on. World War I is a distant memory, even for the elderly. Many Americans don't even know the real origins of "Veterans Day." But imagine, for a moment, if we'd lost more than just the memory of the day's origins. Imagine if, along with losing those who remembered the world when Armistice Day was first celebrated, or even what the experience of WWII meant, we were also losing the language through which those memories had been lived and recorded. Chances are that any arguments about the accidental origins of that language, or its obscure use in the commercial world, would suddenly seem far less important to us than keeping that link with our heritage and past alive. No matter what anyone on the outside thought.