There's been a certain poignancy to Veterans Day, in recent times, as the very last keepers of exactly what November 11th means close their aging eyes and leave us. At last count this year, there were perhaps five veterans of WWI still living. Add those too young to fight, and there are still only a handful who remember the end of the war and all that era contained and meant.
The living memory of World War II is not quite so close to extinction, but it, too, is slipping away. The youngest WWII veterans, assuming an age of 17 at the end of the war, are now 81. There might be one or two who slipped in younger, but if the living memory of the war were a language, it would be classified as "moribund," meaning it had only a few elderly speakers left, according to the UNESCO "Atlas of the World's Languages in Danger of Disappearing."
We feel the ache and pressure, as time grows short, to preserve as much of the wisdom and as many of the memories from those veterans as we can, sensing that when the last of them leave us, we will be bereft of something important: a part of our heritage, story and learning whose loss will leave us the poorer. There's even a Veterans History Project, organized by the Library of Congress, that's trying to collect as many veterans' stories as possible before time runs out.
Our parents' and great-great-grandparents' memories, after all, tell us not only of the world before our time, but of who we are and where we came from. They give us our pride, our shame, our sense of grounding and roots, and a sense of continuity that is a unique part of our personal narrative and identity. But what about the language those ancestors spoke? Is that an important part of the picture, as well? And does it need to be kept "alive" in the same sense that we want their stories remembered and retold?
It's a relevant question, because experts expect that as many as 90% of the world's roughly 7,000 languages will become extinct within the next 100 years as cultures mesh and isolated tribes die out. And the answer may well depend on where you sit when you view the question.
Some in the linguistic community are responding to the accelerating pace of language loss by scrambling to create a language database similar to the Library of Congress's Veterans History Project. Fifty internationally renowned linguists are gathering at the University of Utah this week to take the first steps toward cataloguing some of the world's endangered, seriously endangered, or moribund languages before they become extinct. They hope that the databases they help to create (and help direct funding to support) will provide the equivalent of DNA material that can be used to reconstruct languages, with all their cultural clues and connections, even after the last person with a spoken knowledge of them dies.
"The wisdom of humanity is coded in language," says Lyle Campbell, director of the university's Center for American Indian Languages. "Once a language dies, the knowledge dies with it."
But not all linguists agree. In a recent World Affairs article, John McWhorter, a linguist and lecturer in the Department of English and Comparative Literature at Columbia University, asked: "Would it be inherently evil if there were not 6,000 spoken languages but one? We must consider the question in its pure, logical essence, apart from particular associations with English and its history."
McWhorter's argument, laid out at length, is that while the death of a language is an artistic loss, our attachment to linguistic diversity is itself a bit perverse, since in his view languages multiplied simply as a function of the geographical dispersion of peoples. Language, he believes, is not inherently linked to culture. And as a matter of practicality in an increasingly global world, he argues, fewer languages mean not only less work, in terms of learning and maintenance, but an actual advantage.
More than one aspiring national government, especially in its nascent stages, would have agreed with McWhorter on that last point. But not because language is separate from culture. On the contrary, efforts to stamp out regional languages and instill one unified national language are undertaken precisely because language is so inextricably central to culture. Just as regional or tribal languages are seen as a threat to national loyalty and identity, a national language does more than make trade and communication easier: it helps build another, unified, "national" identity in their place.
Unfortunately, that strategy doesn't always work. Or, at least, not without a cost. Pamela Serota Cote, whose doctoral research at the University of San Francisco focused on Breton language and identity, argues that treating language only as a practical tool, or appraising it from the outside like a connoisseur, as McWhorter does, misses the central importance of language to personal narrative and identity.
"We understand things, events, ourselves and others through a process of interpretation, which occurs in language," she argues. "The diversity of our languages represents the richness of our expressiveness of Being. This is how language, culture and identity intersect; it is also why the loss of a language is such a concern and why minority language rights is such an emotionally charged issue in countries around the world. Because language discloses cultural and historical meaning, the loss of language is a loss of that link to the past. Without a link to the past, people in a culture lose a sense of place, purpose and path; one must know where one came from to know where one is going. The loss of language undermines a people's sense of identity and belonging, which uproots the entire community in the end. Yes, they may become incorporated into the dominant language and culture that has subsumed them, but they have lost their heritage along the way."
If the last living members of a community or culture who speak a particular dialect or language die, there are no descendants to be uprooted, of course. And, perhaps, there is nothing to be done about that. Serota Cote acknowledges that for a language to be revived, there has to be a population left to learn it, and a strong desire among young people to revive that connection with their heritage.
But in Brittany, which was gathered into France only after the Revolution, the language became endangered not because of low population numbers, but because national edicts mandated that French be the only language spoken or learned. Finally, in the late 1970s, a movement sprang up to revive the Breton language, which bears far more resemblance to the tongue of Brittany's Celtic settlers than to French. Language-immersion schools now teach children Breton alongside French, and other cultural revival efforts in Breton music and dance have accompanied the language movement.
The result has been remarkable, even though only a tiny percentage of Bretons actually go through the language schools. The Bretons have not revolted against French rule. But the shame at being Breton has receded, much as the African-American "Roots" movement reduced the shame at being black by offering a narrative and a pride that the descendants of slaves had lacked. High rates of alcoholism and depression have receded and, as Serota Cote observed, "every Breton I spoke with who has learned the language as an adult said they feel now that they have been able to close the gap and heal those past wounds of shame. Many described finally discovering their roots by learning the language. One Breton said that the language 'completes the whole.'"
The challenge of melding and balancing past and present, tribal roots and unified national identity, is one many nations struggle with. Too much tribal loyalty can breed division, but too much focus on a unified whole can destroy not only colors in the cultural fabric of a country, but an important sense of identity and narrative continuity among its diverse citizens. And language, like family or cultural memories, can play an important role in that narrative.
Sometimes a language dies because an entire population dies out. That's still a loss, just as every plant and animal that becomes extinct is a loss to the richness of the planet's tapestry of existence. But in cases where the language wanes not because of physical extinction, but because of cultural subsumption, the loss of a language is a far more personal tragedy, at least to those within that culture. For someone inside a lost or dying culture, a language can be like the memories of our grandparents: not required, or even convenient, for efficiency of operation in a modern, globalized world, but essential for our sense of roots, security, identity, pride, continuity and wholeness.
Life moves on. World War I is a distant memory, even for the elderly. Many Americans don't even know the real origins of "Veterans Day." But imagine, for a moment, if we'd lost more than just the memory of the day's origins. Imagine if, along with losing those who remembered the world when Armistice Day was first celebrated, or even what the experience of WWII meant, we were also losing the language through which those memories had been lived and recorded. Chances are that any arguments about the accidental origins of that language, or its obscure use in the commercial world, would suddenly seem far less important to us than keeping that link with our heritage and past alive. No matter what anyone on the outside thought.