There's been a certain poignancy to Veterans Day, in recent times, as the very last keepers of exactly what November 11th means close their aging eyes and leave us. At last count this year, there were perhaps five veterans of WWI still living. Add those too young to fight, and there are still only a handful who remember the end of the war and all that era contained and meant.
The living memory of World War II is not quite so close to extinction, but it, too, is slipping away. The youngest WWII veterans ... assuming an age of 17 at the end of the war ... are now 81. There might be one or two who slipped in younger, but if the living memory of the war were a language, it would be classified as "moribund," meaning it had only a few elderly speakers left, according to the UNESCO "Atlas of the World's Languages in Danger of Disappearing."
We feel the ache and pressure, as time grows short, to try to preserve as much of the wisdom and as many of the memories from those veterans as we can, sensing that when the last of them leave us, we will be bereft of something important: a part of our heritage, story and learning whose loss will leave us the poorer. There's even a Veterans History Project, organized by the Library of Congress, that's trying to collect as many veterans' stories as possible before time runs out.
Our parents' and great-great-grandparents' memories, after all, tell us not only of the world before our time, but of who we are and where we came from. They give us our pride, our shame, our sense of grounding and roots, and a sense of continuity that is a unique part of our personal narrative and identity. But what about the language those ancestors spoke? Is that an important part of the picture, as well? And does it need to be kept "alive" in the same sense that we want their stories remembered and retold?
It's a relevant question, because experts expect that 90% of the world's approximately 7,000 languages will become extinct in the next 100 years as cultures mesh and isolated tribes die out. And the answer may well depend on where you sit when you view the question.
Some in the linguistic community are responding to the accelerating pace of language loss by scrambling to create a language database similar to the Library of Congress's Veterans History Project. Fifty internationally renowned linguists are gathering at the University of Utah this week to take the first steps in trying to catalogue some of the world's endangered, seriously endangered, or moribund languages before they become extinct. They hope that the databases they help to create (and help direct funding to support) will provide the equivalent of DNA material that can be used to reconstruct languages, with all their cultural clues and connections, even after the last person with a spoken knowledge of them dies.
"The wisdom of humanity is coded in language," says Lyle Campbell, director of the university's Center for American Indian Languages. "Once a language dies, the knowledge dies with it."
But not all linguists agree. In a recent World Affairs article, John McWhorter, a linguist and lecturer in the Department of English and Comparative Literature at Columbia University, asked, "Would it be inherently evil if there were not 6,000 spoken languages but one? We must consider the question in its pure, logical essence, apart from particular associations with English and its history."
McWhorter's lengthy argument asserts that while the death of a language is an artistic loss, our attachment to linguistic diversity is itself a bit perverse, since in his view languages diverged simply because populations dispersed geographically. Language, he believes, is not inherently linked to culture. And as a matter of practicality in an increasingly global world, the use and existence of fewer languages is not only less work, in terms of learning and maintenance, but actually an advantage.
More than one aspiring national government, especially in its nascent stages, would have agreed with McWhorter on that last point. But not because language is separate from culture. On the contrary, efforts to stamp out regional languages and instill one, unified national language are undertaken precisely because language is so inextricably central to culture. So just as regional or tribal languages are seen as a threat to national loyalty and identity, a national language doesn't just make trade and communication easier. It also helps build another, unified, "national" identity, instead.
Unfortunately, that strategy doesn't always work. Or, at least, not without a cost. Pamela Serota Cote, whose doctoral research at the University of San Francisco focused on Breton language and identity, argues that looking at language as only a practical tool or as an outside connoisseur, as McWhorter does, misses the central importance of language to personal narrative and identity.
"We understand things, events, ourselves and others through a process of interpretation, which occurs in language," she argues. "The diversity of our languages represents the richness of our expressiveness of Being. This is how language, culture and identity intersect; it is also why the loss of a language is such a concern and why minority language rights is such an emotionally charged issue in countries around the world. Because language discloses cultural and historical meaning, the loss of language is a loss of that link to the past. Without a link to the past, people in a culture lose a sense of place, purpose and path; one must know where one came from to know where one is going. The loss of language undermines a people's sense of identity and belonging, which uproots the entire community in the end. Yes, they may become incorporated into the dominant language and culture that has subsumed them, but they have lost their heritage along the way."
If the last living members of a community or culture who speak a particular dialect or language die, there are no descendants to be uprooted, of course. And, perhaps, there is nothing to be done about that. Serota Cote acknowledges that for a language to be revived, there has to be a population left to learn it, and a strong desire among the young people to revive that connection with their heritage.
But in Brittany, which was gathered into France only after the Revolution, the language became endangered not because of low population numbers, but because national edicts mandated that French be the only language spoken or learned. Finally, in the late 1970s, a movement sprang up to revive the Breton language, which bears far more resemblance to the tongue of Brittany's Celtic settlers than to French. Language immersion schools now teach the language to children wishing to learn Breton as well as French, and other cultural revival efforts in Breton music and dance have accompanied the language movement.
The result has been remarkable, even though only a tiny percentage of Bretons actually go through the language schools. The Bretons have not revolted against French rule. But the shame at being Breton has receded, much as the African-American "Roots" movement reduced the shame at being black by offering a narrative story and pride that the children of subsumed slaves had lacked. High rates of alcoholism and depression have receded and, as Serota Cote observed, "every Breton I spoke with who has learned the language as an adult said they feel now that they have been able to close the gap and heal those past wounds of shame. Many described finally discovering their roots by learning the language. One Breton said that the language 'completes the whole.'"
The challenge of melding and balancing past and present, tribal roots and unified national identity, is one many nations struggle with. Too much tribal loyalty can breed division, but too much focus on a unified whole can destroy not only colors in the cultural fabric of a country, but an important sense of identity and narrative continuity among its diverse citizens. And language, like family or cultural memories, can play an important role in that narrative.
Sometimes language dies because an entire population dies out. That's still a loss, just as every plant and animal that becomes extinct is a loss to the richness of the planet's tapestry of existence. But in cases where the language wanes not because of physical extinction, but because of cultural subsumption, the loss of a language is a far more personal tragedy ... at least to those within that culture. For someone inside a lost or dying culture, a language can be like the memories of our grandparents--not required, or even convenient, for efficiency of operation in a modern, globalized world, but essential for our sense of roots, security, identity, pride, continuity and wholeness.
Life moves on. World War I is a distant memory, even for the elderly. Many Americans don't even know the real origins of Veterans Day. But imagine, for a moment, if we'd lost more than just the memory of the day's origins. Imagine if, along with losing those who remembered the world when Armistice Day was first celebrated, or even what the experience of WWII meant, we were also losing the language through which those memories had been lived and recorded. Chances are that any arguments about the accidental origins of that language, or its obscure use in the commercial world, would suddenly seem far less important to us than keeping that link with our heritage and past alive. No matter what anyone on the outside thought.