There's been a certain poignancy to Veterans Day, in recent times, as the very last keepers of exactly what November 11th means close their aging eyes and leave us. At last count this year, there were perhaps five veterans of WWI still living. Add those who were too young to fight but old enough to remember, and there are still only a handful who recall the end of the war and all that era contained and meant.
The living memory of World War II is not quite so close to extinction, but it, too, is slipping away. The youngest WWII veterans, assuming an age of 17 at the end of the war, are now 81. There might be one or two who slipped in younger, but if the living memory of the war were a language, it would be classified as "moribund," meaning it has only a few elderly speakers left, according to UNESCO's "Atlas of the World's Languages in Danger of Disappearing."
We feel the ache and pressure, as time grows short, to try to preserve as much of the wisdom and as many of the memories from those veterans as we can, sensing that when the last of them leave us, we will be bereft of something important: a part of our heritage, story, and learning that will leave us the poorer for its loss. There's even a Veterans History Project, organized by the Library of Congress, that's trying to collect as many veterans' stories as possible before time runs out.
Our parents' and great-great-grandparents' memories, after all, tell us not only of the world before our time, but of who we are and where we came from. They give us our pride, our shame, our sense of grounding and roots, and a sense of continuity that is a unique part of our personal narrative and identity. But what about the language those ancestors spoke? Is that an important part of the picture, as well? And does it need to be kept "alive" in the same sense that we want their stories remembered and retold?
It's a relevant question, because experts expect that 90% of the world's approximately 7,000 languages will become extinct within the next 100 years as cultures mesh and isolated tribes die out. And the answer may well depend on where you sit when you view the question.
Some in the linguistic community are responding to the accelerating pace of language loss by scrambling to create a language database similar to the Library of Congress's Veterans History Project. Fifty internationally renowned linguists are gathering at the University of Utah this week to take the first steps toward cataloguing some of the world's endangered, seriously endangered, or moribund languages before they become extinct. They hope that the databases they help to create (and help direct funding to support) will provide the equivalent of DNA material that can be used to reconstruct languages, with all their cultural clues and connections, even after the last person with spoken knowledge of them dies.
"The wisdom of humanity is coded in language," says Lyle Campbell, director of the university's Center for American Indian Languages. "Once a language dies, the knowledge dies with it."
But not all linguists agree. In a recent World Affairs article, John McWhorter, a linguist and lecturer in the Department of English and Comparative Literature at Columbia University, asked: "Would it be inherently evil if there were not 6,000 spoken languages but one? We must consider the question in its pure, logical essence, apart from particular associations with English and its history."
McWhorter's argument, developed at length, is that while the death of a language is an artistic loss, our attachment to linguistic diversity is itself a bit perverse, since he believes languages grew up as a function of the geographic dispersion of peoples. Language, in his view, is not inherently linked to culture. And as a matter of practicality in an increasingly global world, the use and existence of fewer languages is not only less work, in terms of learning and maintenance, but actually an advantage.
More than one aspiring national government, especially in its nascent stages, would have agreed with McWhorter on that last point, but not because language is separate from culture. On the contrary, efforts to stamp out regional languages and impose a single, unified national language are undertaken precisely because language is so inextricably central to culture. So just as regional or tribal languages are seen as a threat to national loyalty and identity, a national language doesn't just make trade and communication easier; it also helps build a unified "national" identity in their place.
Unfortunately, that strategy doesn't always work. Or, at least, not without a cost. Pamela Serota Cote, whose doctoral research at the University of San Francisco focused on Breton language and identity, argues that looking at language only as a practical tool, or with the detachment of an outside connoisseur, as McWhorter does, misses the central importance of language to personal narrative and identity.
"We understand things, events, ourselves and others through a process of interpretation, which occurs in language," she argues. "The diversity of our languages represents the richness of our expressiveness of Being. This is how language, culture and identity intersect; it is also why the loss of a language is such a concern and why minority language rights is such an emotionally charged issue in countries around the world. Because language discloses cultural and historical meaning, the loss of language is a loss of that link to the past. Without a link to the past, people in a culture lose a sense of place, purpose and path; one must know where one came from to know where one is going. The loss of language undermines a people's sense of identity and belonging, which uproots the entire community in the end. Yes, they may become incorporated into the dominant language and culture that has subsumed them, but they have lost their heritage along the way."
If the last living members of a community or culture who speak a particular dialect or language die, there are no descendants to be uprooted, of course. And, perhaps, there is nothing to be done about that. Serota Cote acknowledges that for a language to be revived, there has to be a population left to learn it, and a strong desire among its young people to revive that connection with their heritage.
But in Brittany, which was fully absorbed into France only after the Revolution, the language became endangered not because of low population numbers, but because national edicts mandated that French be the only language spoken or learned. Finally, in the late 1970s, a movement sprang up to revive the Breton language, which bears far more resemblance to the tongue of Brittany's Celtic settlers than to French. Language-immersion schools now teach Breton, alongside French, to children who wish to learn it, and other cultural revival efforts in Breton music and dance have accompanied the language movement.
The result has been remarkable, even though only a tiny percentage of Bretons actually go through the language schools. The Bretons have not revolted against French rule. But the shame at being Breton has receded, much as the African-American "Roots" movement reduced the shame at being black by offering a narrative and a pride that the descendants of enslaved people had lacked. High rates of alcoholism and depression have receded and, as Serota Cote observed, "every Breton I spoke with who has learned the language as an adult said they feel now that they have been able to close the gap and heal those past wounds of shame. Many described finally discovering their roots by learning the language. One Breton said that the language 'completes the whole.'"
The challenge of melding and balancing past and present, tribal roots and unified national identity, is one many nations struggle with. Too much tribal loyalty can breed division, but too much focus on a unified whole can destroy not only colors in the cultural fabric of a country, but an important sense of identity and narrative continuity among its diverse citizens. And language, like family or cultural memories, can play an important role in that narrative.
Sometimes a language dies because an entire population dies out. That's still a loss, just as every plant and animal that becomes extinct is a loss to the richness of the planet's tapestry of existence. But in cases where a language wanes not because of physical extinction but because of cultural subsumption, its loss is a far more personal tragedy, at least to those within that culture. For someone inside a lost or dying culture, a language can be like the memories of our grandparents: not required, or even convenient, for efficiency of operation in a modern, globalized world, but essential for our sense of roots, security, identity, pride, continuity and wholeness.
Life moves on. World War I is a distant memory, even for the elderly. Many Americans don't even know the real origins of "Veterans Day." But imagine, for a moment, if we'd lost more than just the memory of the day's origins. Imagine if, along with losing those who remembered the world when Armistice Day was first celebrated, or even what the experience of WWII meant, we were also losing the language through which those memories had been lived and recorded. Chances are that any arguments about the accidental origins of that language, or its limited usefulness in the commercial world, would suddenly seem far less important to us than keeping that link with our heritage and past alive. No matter what anyone on the outside thought.