There's been a certain poignancy to Veterans Day in recent times, as the very last keepers of exactly what November 11th means close their aging eyes and leave us. At last count this year, there were perhaps five veterans of WWI still living. Add those too young to fight, and there are still only a handful who remember the end of the war and all that era contained and meant.
The living memory of World War II is not quite so close to extinction, but it, too, is slipping away. The youngest WWII veterans, assuming an age of 17 at the end of the war, are now 81. There might be one or two who slipped in younger, but if the living memory of the war were a language, it would be classified as "moribund," meaning it had only a few elderly speakers left, according to the UNESCO "Atlas of the World's Languages in Danger of Disappearing."
We feel the ache and pressure, as time grows short, to try to preserve as much of the wisdom and as many of the memories from those veterans as we can, sensing that when the last of them leave us, we will be bereft of something important: a part of our heritage, story and learning that will leave us the poorer for its loss. There's even a Veterans History Project, organized by the Library of Congress, that's trying to collect as many veterans' stories as possible before time runs out.
Our parents' and great-great-grandparents' memories, after all, tell us not only of the world before our time, but of who we are and where we came from. They give us our pride, our shame, our sense of grounding and roots, and a sense of continuity that is a unique part of our personal narrative and identity. But what about the language those ancestors spoke? Is that an important part of the picture, as well? And does it need to be kept "alive" in the same sense that we want their stories remembered and retold?
It's a relevant question, because experts expect that 90% of the world's approximately 7,000 languages will become extinct within the next 100 years as cultures mesh and isolated tribes die out. And the answer may well depend on where you sit when you view the question.
Some in the linguistic community are responding to the accelerating pace of language loss by scrambling to create a language database similar to the Library of Congress's Veterans History Project. Fifty internationally renowned linguists are gathering at the University of Utah this week to take the first steps in trying to catalogue some of the world's endangered, seriously endangered, or moribund languages before they become extinct. They hope that the databases they help to create (and help direct funding to support) will provide the equivalent of DNA material that can be used to reconstruct languages, with all their cultural clues and connections, even after the last person with a spoken knowledge of them dies.
"The wisdom of humanity is coded in language," says Lyle Campbell, director of the university's Center for American Indian Languages. "Once a language dies, the knowledge dies with it."
But not all linguists agree. In a recent World Affairs article, John McWhorter, a linguist and lecturer in the Department of English and Comparative Literature at Columbia University, asked "would it be inherently evil if there were not 6,000 spoken languages but one? We must consider the question in its pure, logical essence, apart from particular associations with English and its history."
McWhorter's argument, which is long, asserts that while the death of a language is an artistic loss, our attachment to diverse languages itself is a bit perverse, since in his view those languages grew up simply as a function of the geographical dispersion of peoples. Language, he believes, is not inherently linked to culture. And as a matter of practicality in an increasingly global world, the use and existence of fewer languages is not only less work, in terms of learning and maintenance, but actually an advantage.
More than one aspiring national government, especially in its nascent stages, would have agreed with McWhorter on that last point. But not because language is separate from culture. On the contrary, efforts to stamp out regional languages and instill one, unified national language are undertaken because language is so inextricable from, and central to, culture. So just as regional or tribal languages are seen as a threat to national loyalty and identity, a national language doesn't just make trade and communication easier. It also helps build another, unified, "national" identity, instead.
Unfortunately, that strategy doesn't always work. Or, at least, not without a cost. Pamela Serota Cote, whose doctoral research at the University of San Francisco focused on Breton language and identity, argues that looking at language only as a practical tool, or as an outside connoisseur, as McWhorter does, misses the central importance of language to personal narrative and identity.
"We understand things, events, ourselves and others through a process of interpretation, which occurs in language," she argues. "The diversity of our languages represents the richness of our expressiveness of Being. This is how language, culture and identity intersect; it is also why the loss of a language is such a concern and why minority language rights is such an emotionally charged issue in countries around the world. Because language discloses cultural and historical meaning, the loss of language is a loss of that link to the past. Without a link to the past, people in a culture lose a sense of place, purpose and path; one must know where one came from to know where one is going. The loss of language undermines a people's sense of identity and belonging, which uproots the entire community in the end. Yes, they may become incorporated into the dominant language and culture that has subsumed them, but they have lost their heritage along the way."
If the last living members of a community or culture who speak a particular dialect or language die, there are no descendants to be uprooted, of course. And, perhaps, there is nothing to be done about that. Serota Cote acknowledges that for a language to be revived, there has to be a population left to learn it, and a strong desire among the young people to revive that connection with their heritage.
But in Brittany, which was gathered into France only after the Revolution, the language became endangered not because of low population numbers, but because national edicts mandated that French be the only language spoken or learned. Finally, in the late 1970s, a movement sprang up to revive the Breton language, which bears far more resemblance to the tongue of Brittany's Celtic settlers than to French. Language immersion schools now teach the language to children wishing to learn Breton as well as French, and other cultural revival efforts in Breton music and dance have accompanied the language movement.
The result has been remarkable, even though only a tiny percentage of Bretons actually go through the language schools. The Bretons have not revolted against French rule. But the shame at being Breton has receded, much as the African-American "Roots" movement reduced the shame at being black by offering a narrative story and pride that the children of subsumed slaves had lacked. A high rate of alcoholism and depression has receded and, as Serota Cote observed, "every Breton I spoke with who has learned the language as an adult said they feel now that they have been able to close the gap and heal those past wounds of shame. Many described finally discovering their roots by learning the language. One Breton said that the language 'completes the whole.'"
The challenge of melding and balancing past and present, tribal roots and unified national identity, is one many nations struggle with. Too much tribal loyalty can breed division, but too much focus on a unified whole can destroy not only colors in the cultural fabric of a country, but an important sense of identity and narrative continuity among its diverse citizens. And language, like family or cultural memories, can play an important role in that narrative.
Sometimes language dies because an entire population dies out. That's still a loss, just as every plant and animal that becomes extinct is a loss to the richness of the planet's tapestry of existence. But in cases where the language wanes not because of physical extinction, but because of cultural subsumption, the loss of a language is a far more personal tragedy, at least to those within that culture. For someone inside a lost or dying culture, a language can be like the memories of our grandparents: not required, or even convenient, for efficiency of operation in a modern, globalized world, but essential for our sense of roots, security, identity, pride, continuity and wholeness.
Life moves on. World War I is a distant memory, even for the elderly. Many Americans don't even know the real origins of Veterans Day. But imagine, for a moment, if we'd lost more than just the memory of the day's origins. Imagine if, along with losing those who remembered the world when Armistice Day was first celebrated, or even what the experience of WWII meant, we were also losing the language through which those memories had been lived and recorded. Chances are that any arguments about the accidental origins of that language, or its obscure use in the commercial world, would suddenly seem far less important to us than keeping that link with our heritage and past alive. No matter what anyone on the outside thought.
After more than a year of rumors and speculation, Bruce Jenner publicly came out as transgender with four simple words: “I am a woman.”
“My brain is much more female than male,” he explained to Diane Sawyer, who conducted a primetime interview with Jenner on ABC Friday night. (Jenner indicated he prefers to be addressed with male pronouns at this time.) During the two-hour program, Jenner discussed his personal struggle with gender dysphoria and personal identity, how it shaped his past and current relationships and marriages, and how he finally told his family about his true gender identity.
The show went to impressive lengths to explain unfamiliar concepts of gender and sexuality to its audience, although it didn't always go smoothly. Sawyer’s questions occasionally came off as awkward and tone-deaf, mirroring a broader lack of understanding among many Americans about the difficulties that trans people face. But Sawyer’s empathy also shone when explaining concepts like gender identity and transitioning to her audience—a rare experience on primetime American television. It was a powerful signal of how much progress the LGBT movement has made over the past twenty years, and of how much progress remains to be made: the T in that acronym still lags behind the other three letters in both social acceptance and legal protections.
In her new book No One Understands You and What To Do About It, Heidi Grant Halvorson tells readers a story about her friend, Tim. When Tim started a new job as a manager, one of his top priorities was communicating to his team that he valued each member’s input. So at team meetings, as each member spoke up about whatever project they were working on, Tim made sure he put on his “active-listening face” to signal that he cared about what each person was saying.
But after meeting with him a few times, Tim’s team got a very different message from the one he intended to send. “After a few weeks of meetings,” Halvorson explains, “one team member finally summoned up the courage to ask him the question that had been on everyone’s mind.” That question was: “Tim, are you angry with us right now?” When Tim explained that he wasn’t at all angry—that he was just putting on his “active-listening face”—his colleague gently explained that his active-listening face looked a lot like his angry face.
What is the Islamic State? Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
New Zealand's largest newspaper is deeply conflicted. With the World Cup underway in Brazil, should The New Zealand Herald refer to the "global round-ball game" as "soccer" or "football"? The question has been put to readers, and the readers have spoken. It's "football"—by a wide margin.
We in the U.S., of course, would disagree. And now we have a clearer understanding of why. In May, Stefan Szymanski, a sports economist at the University of Michigan, published a paper debunking the notion that "soccer" is a semantically bizarre American invention. In fact, it's a British import. And the Brits used it often—until, that is, it became too much of an Americanism for British English to bear.
The story begins, as many good stories do, in a pub. As early as the Middle Ages, Szymanski explains, the rough outlines of soccer—a game, a ball, feet—appear to have been present in England. But it wasn't until the sport became popular among aristocratic boys at schools like Eton and Rugby in the nineteenth century that these young men tried to standardize play. On a Monday evening in October 1863, the leaders of a dozen clubs met at the Freemasons' Tavern in London to establish "a definite code of rules for the regulation of the game." They did just that, forming the Football Association. The most divisive issue was whether to permit "hacking," or kicking an opponent in the leg (the answer, ultimately, was 'no').
The editors of Smithsonian magazine have announced the winners of their 12th annual photo contest, selected from more than 26,500 entries. The winning photographs from the competition's six categories (The Natural World, Travel, People, Americana, Altered Images and Mobile) are published below, along with a few finalists. Captions were written by the photographers. Be sure to visit the contest page at Smithsonian.com to see all the winners and finalists.
Mary Hamm was in pain, though it was hard to tell. She bustled around the Starbucks, pouring drinks, restocking pastries, and greeting customers with an unshakable gaze perfected during 25 years of working in hospitality. Her smile said, How can I help you? Her eyes said, I know you’re going to order a caramel Frappuccino, so let’s do this.
Occupying prime space in a Fredericksburg, Virginia, strip mall, beside a Dixie Bones BBQ Post, this Starbucks pulls in about $40,000 a week. Hamm, 49, had been managing Starbucks stores for 12 years. The problem was her feet. After two decades in the food-service business, they had started to wear out. She had two metal plates in the right one, installed over the course of five surgeries. Now her left foot needed surgery too. She doesn’t like to complain, but when I asked her how often she was in pain, she smiled and said quietly, “All the time.”
Today brought the latest installment of the never-ending Clinton scandal saga, but it won’t be the last. Yet in some ways, the specifics are a distraction. The sale of access was designed into the post-2001 Clinton family finances from the start. Probably nobody will ever prove that this quid led to that quo … but there’s about a quarter-billion dollars of quid heaped in plain sight and an equally impressive pile of quo, and it’s all been visible for years to anyone who cared to notice. As Jonathan Chait, who is no right-wing noise-machine operator, complained: “The Clintons have been disorganized and greedy.”
“All of this amounts to diddly-squat,” pronounced long-time Clinton associate James Carville when news broke that Hillary Clinton had erased huge numbers of emails. That may not be true: If any of the conduct in question proves illegal, destroying relevant records may also have run afoul of the law.
Leon Trotsky is not often invoked as a management guru, but a line frequently attributed to him would surely resonate with many business leaders today. “You may not be interested in war,” the Bolshevik revolutionary is said to have warned, “but war is interested in you.” War, or at least geopolitics, is figuring more and more prominently in the thinking and fortunes of large businesses.
Of course, multinational companies such as Shell and GE have long cultivated an expertise in geopolitics. But the intensity of concern over global instability is much higher now than in any recent period. In 2013, the private-equity colossus KKR named the retired general and CIA director David Petraeus as the chairman of its global institute, which informs the firm’s investment decisions. Earlier this year, Sir John Sawers, the former head of MI6, Britain’s equivalent of the CIA, became the chairman of Macro Advisory Partners, a firm that advises businesses and governments on geopolitics. Both appointments are high-profile examples of a much wider trend: an increasing number of corporations are hiring political scientists, starting their board meetings with geopolitical briefings, and seeking the advice of former diplomats, spymasters, and military leaders.

“The last three years have definitely been a wake-up call for business on geopolitics,” Dominic Barton, the managing director of McKinsey, told me. “I’ve not seen anything like it. Since the Second World War, I don’t think you’ve seen such volatility.” Most businesses haven’t pulled back meaningfully from globalized operation, Barton said. “But they are thinking, Gosh, what’s next?”
When healthcare is at its best, hospitals are four-star hotels, and nurses, personal butlers at the ready—at least, that’s how many hospitals seem to interpret a government mandate.
When Department of Health and Human Services administrators decided to base 30 percent of hospitals’ Medicare reimbursement on patient satisfaction survey scores, they likely figured that transparency and accountability would improve healthcare. The Centers for Medicare and Medicaid Services (CMS) officials wrote, rather reasonably, “Delivery of high-quality, patient-centered care requires us to carefully consider the patient’s experience in the hospital inpatient setting.” They probably had no idea that their methods could end up indirectly harming patients.
“I think it’s just gonna be us.” The voice came from one of the grizzled sports reporters gathered in a nearly empty Manhattan hotel ballroom on a cold Friday evening in February. There was the reporter from NBA.com, the house organ of the National Basketball Association; there was the travel/sports/entertainment writer for the Queens Chronicle; and there was a guy who had a lot to say about the New Jersey Devils.
We were there, the four of us, for a mid-season press conference organized by the National Basketball Players Association to coincide with the league’s All-Star festivities. Professional sports unions, dedicated as they are to the cause of helping millionaire athletes make more money, have never been popular, but the nearly empty ballroom felt especially grim relative to the weekend’s other attractions. The All-Star Celebrity Game—pitting the 5-foot-4-inch comedian Kevin Hart against, among others, Mo’ne Davis, the 13-year-old girl who starred in last year’s Little League World Series—would tip off at Madison Square Garden later that night. That, at least, had news potential.