Most countries have always had multiple sets of values, and which set takes dominance depends on complex interactions of economic, political, and social interests.
Washington Post columnist Richard Cohen, reacting to Volkswagen's impending emergence as the world's largest automobile manufacturer, contrasts the rebirth of Germany and Japan after the Second World War with the struggles of Egypt and Libya to establish political and economic stability after the overthrow of their own tyrannical regimes. The key to everything is culture, he says:
Don't ask me to define the term, but it is something within us individually and something collectively within a nation or people. It is about all that Japan and Germany were left with -- no oil or gas, that's for sure. It explains why Germany, dismembered in a vast and horrendous population exchange, and the eastern sector of it mismanaged for years afterward by knuckleheaded communists, is now Europe's preeminent economic power. Germany may no longer be uber alles, but it's definitely uber quite a bit.
The problem with "culture" as an explanation is that most nations have always had multiple sets of values, which come and go in leadership. In the period of my original specialty, early-19th-century Germany, states such as Prussia and Bavaria and free cities like Hamburg had very distinctive societies and institutions, and the German stereotype in the West was more likely to be the dreamy artist or poet than the military officer. It was a complex interaction of economic, political, and social interests and institutions that produced the Bismarckian Reich. Japanese culture also had multiple strands, immortalized in Ruth Benedict's best-selling The Chrysanthemum and the Sword, a wartime analysis of the enemy that later helped shape Japanese culture itself, just as Tocqueville has influenced America's self-understanding. (So much for the uselessness of anthropology.)
Mr. Cohen omits an outstanding and obvious circumstance that separates Germany and Japan from Egypt and Libya -- apart from the fact that neither Axis nation was totally destroyed by wartime bombing and that both retained immense reserves of machinery and technological skill. (Yes, the Soviets dismantled factories in their zone, but the U.S. helped move key personnel, including senior staff of Zeiss, to the West.) Bolstering the German and Japanese economies was also essential both for U.S. trade interests and for the strategic containment of Soviet communism. Meanwhile the new state of Israel was self-consciously challenging the stereotypes of Jewish culture as the immigration of Jews from Islamic-majority lands helped shape Israeli culture. And in the postwar years, British industrial culture, once the envy of the world (and an underrated engine of Allied victory, as the historian of technology David Edgerton has just shown in Britain's War Machine), was beginning to unravel.
Is cultural change a butterfly-effect phenomenon that will always resist social science modeling, or are there patterns that have so far eluded us? I can't say, but meanwhile I don't think it's helpful to explain anything with a concept that resists definition.
Edward Tenner is a historian of technology and culture, and an affiliate of the Center for Arts and Cultural Policy at Princeton's Woodrow Wilson School. He was a founding advisor of Smithsonian's Lemelson Center.
She lived with us for 56 years. She raised me and my siblings without pay. I was 11, a typical American kid, before I realized who she was.
The ashes filled a black plastic box about the size of a toaster. It weighed three and a half pounds. I put it in a canvas tote bag and packed it in my suitcase this past July for the transpacific flight to Manila. From there I would travel by car to a rural village. When I arrived, I would hand over all that was left of the woman who had spent 56 years as a slave in my family’s household.
The condition has long been considered untreatable. Experts can spot it in a child as young as 3 or 4. But a new clinical approach offers hope.
This is a good day, Samantha tells me: 10 on a scale of 10. We’re sitting in a conference room at the San Marcos Treatment Center, just south of Austin, Texas, a space that has witnessed countless difficult conversations between troubled children, their worried parents, and clinical therapists. But today promises unalloyed joy. Samantha’s mother is visiting from Idaho, as she does every six weeks, which means lunch off campus and an excursion to Target. The girl needs supplies: new jeans, yoga pants, nail polish.
At 11, Samantha is just over 5 feet tall and has wavy black hair and a steady gaze. She flashes a smile when I ask about her favorite subject (history), and grimaces when I ask about her least favorite (math). She seems poised and cheerful, a normal preteen. But when we steer into uncomfortable territory—the events that led her to this juvenile-treatment facility nearly 2,000 miles from her family—Samantha hesitates and looks down at her hands. “I wanted the whole world to myself,” she says. “So I made a whole entire book about how to hurt people.”
The office was, until a few decades ago, the last stronghold of fashion formality. Silicon Valley changed that.
Americans began the 20th century in bustles and bowler hats and ended it in velour sweatsuits and flannel shirts—the most radical shift in dress standards in human history. At the center of this sartorial revolution was business casual, a genre of dress that broke the last bastion of formality—office attire—to redefine the American wardrobe.
Born in Silicon Valley in the early 1980s, business casual consists of khaki pants, sensible shoes, and button-down collared shirts. By the time it was mainstream, in the 1990s, it flummoxed HR managers and employees alike. “Welcome to the confusing world of business casual,” declared a fashion writer for the Chicago Tribune in 1995. With time and some coaching, people caught on. Today, though, the term “business casual” is nearly obsolete for describing the clothing of a workforce that includes many who work from home in yoga pants, put on a clean T-shirt for a Skype meeting, and don’t always go into the office.
Unexpected discoveries in the quest to cure an extraordinary skeletal condition show how medically relevant rare diseases can be.
When Jeannie Peeper was born in 1958, there was only one thing amiss: her big toes were short and crooked. Doctors fitted her with toe braces and sent her home. Two months later, a bulbous swelling appeared on the back of Peeper’s head. Her parents didn’t know why: she hadn’t hit her head on the side of her crib; she didn’t have an infected scratch. After a few days, the swelling vanished as quickly as it had arrived.
When Peeper’s mother noticed that the baby couldn’t open her mouth as wide as her sisters and brothers, she took her to the first of various doctors, seeking an explanation for her seemingly random assortment of symptoms. Peeper was 4 when the Mayo Clinic confirmed a diagnosis: she had a disorder known as fibrodysplasia ossificans progressiva (FOP).
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
I bought into the St. Ives lie for years. In the already insecure times of high school and college, my skin was host to constant colonies of acne, my nose peppered with blackheads, my chin and forehead a topographical horror of cystic zits that lasted for weeks. But as I moved into adulthood, it didn’t go away, making me, I suppose, part of a trend—adult acne is on the rise, particularly among women.
I’m sure it never really seemed so bad to others as it did to me, as is the way with these things. I covered it up with layers of gloppy foundation, then with more proficiently applied makeup later on, then went on hormonal birth control, which improved the situation significantly.
But for many of the years in between, I washed my face with St. Ives Apricot Scrub, which is an exfoliator made with granules of walnut shell powder. It is extremely rough. Perhaps too rough. We’ll find out: Kaylee Browning and Sarah Basile recently filed a class-action lawsuit against St. Ives’s maker, Unilever, alleging that the wash “leads to long-term skin damage” and “is not fit to be sold as a facial scrub.”
Can governments be as innovative about saving lives?
Yesterday’s terrorist attack that struck at the end of an Ariana Grande concert in Britain’s Manchester Arena—leaving 22 people dead and 59 injured, by the latest count—feels perhaps even more callous and personal than other such recent atrocities. As The New York Times noted, the target was “a concert spilling over with girls in their teens or younger, with their lives ahead of them, out for a fun night.”
For Europe, the attack, now claimed by ISIS, represents a continuation of a nightmare scenario: The pace and deadliness of terrorist attacks on the continent have reached levels unprecedented in the post-9/11 era, with the heinous and grotesque becoming frighteningly routine.
Even five years ago, specialists could count the major post-9/11 attacks in Western countries on one hand, and knew every date on which they had been perpetrated. They were known by names like 3/11 or 7/7 (references to attacks in Madrid and London, respectively).
The story was notably loud. Its retraction is notably quiet.
On Tuesday of last week, the day after The Washington Post published its bombshell about President Trump’s Oval Office divulgences to Sergey Lavrov and Sergey Kislyak, Sean Hannity took to the air at the Fox News Channel to discuss a murdered man named Seth Rich. Rich, a 27-year-old staffer at the Democratic National Committee, had been gunned down in Washington, DC, in July, seemingly the victim of a violent crime. Earlier that day, however, a local Fox TV station had reported—in a claim that would quickly be debunked—that Rich had ties to WikiLeaks, and that his death was, rather than the tragic result of random violence, evidence of a deeper conspiracy.
In the days since, that idea has leapt to life in the conservative areas of the media—an easy symbol, in the minds of many, of the “mainstream” media’s stubborn and partisan refusal to report on a story that would put the DNC in a negative light. (“Silence from Establishment Media over Seth Rich WikiLeaks Report,” Breitbart seethed.) And so, as many members of the nation’s press corps set out to further the Post’s reporting on the White House, the Rich story became a chorus-like feature on conservative-leaning media—and not just in Hannity’s extra-bombastic corner of Fox News. The Rich story hit Drudge. It exploded on social media. “NOT RUSSIA, BUT AN INSIDE JOB?” Breitbart asked, provocatively. The site added that “if proven, the report has the potential to be one of the biggest cover-ups in American political history, dispelling the widespread claim that the Russians were behind hacks on the DNC.”
Reports that presidential aides asked senior intelligence officials to help shut down the FBI investigation put those staffers in legal jeopardy.
The Washington Post report that White House staffers were involved in President Trump’s alleged effort to shut down the FBI’s investigation into ousted National Security Adviser Michael Flynn increases the legal and political peril for the administration as Robert Mueller’s inquiry moves forward.
On Monday, the Post reported that Trump had asked Director of National Intelligence Dan Coats and National Security Agency Director Mike Rogers to push back on the March testimony of then-FBI Director Jim Comey that Trump campaign associates were being scrutinized as part of the investigation into Russian interference in the 2016 election. Both officials reportedly refused.
“This is very close to what Nixon tried to do in drawing in the CIA to short circuit the FBI investigation during Watergate,” said a former high-ranking Justice Department official. “His advisers could be very much at risk if they played a role in the alleged interference.” The Post did not mention whether Trump-appointed CIA Director Mike Pompeo received a similar request.