Most countries have always had multiple sets of values, and which take dominance depends on complex interactions of economic, political, and social interests.
Washington Post columnist Richard Cohen, reacting to Volkswagen's impending emergence as the world's largest automobile manufacturer, contrasts the rebirth of Germany and Japan after the Second World War with the struggles of Egypt and Libya to establish political and economic stability after the overthrow of their own tyrannical regimes. The key to everything is culture, he says:
Don't ask me to define the term, but it is something within us individually and something collectively within a nation or people. It is about all that Japan and Germany were left with -- no oil or gas, that's for sure. It explains why Germany, dismembered in a vast and horrendous population exchange, and the eastern sector of it mismanaged for years afterward by knuckleheaded communists, is now Europe's preeminent economic power. Germany may no longer be uber alles, but it's definitely uber quite a bit.
The problem with "culture" as an explanation is that most nations have always had multiple sets of values, which come and go in leadership. In the period of my original specialty, early-19th-century Germany, Prussia, Bavaria, and free cities like Hamburg had very distinctive societies and institutions, and the German stereotype in the West was more likely to be the dreamy artist or poet than the military officer. It was a complex interaction of economic, political, and social interests and institutions that produced the Bismarckian Reich. Japanese culture also had multiple strands, immortalized in Ruth Benedict's best-selling The Chrysanthemum and the Sword, a wartime analysis of the enemy that later helped shape Japanese culture itself, just as Tocqueville has influenced America's self-understanding. (So much for the uselessness of anthropology.)
Mr. Cohen omits an outstanding and obvious circumstance that separates Germany and Japan from Egypt and Libya -- apart from the fact that neither Axis nation was totally destroyed by wartime bombing, and both retained immense reserves of machinery and technological skill. (Yes, the Soviets dismantled factories in their zone, but the U.S. helped move key personnel, including senior staff of Zeiss, to the West.) Bolstering the German and Japanese economies was also essential both for U.S. trade interests and for the strategic containment of Soviet communism. Meanwhile the new state of Israel was self-consciously challenging the stereotypes of Jewish culture as immigration of Jews from Islamic-majority lands helped shape Israeli culture. And in the postwar years, British industrial culture, once the envy of the world (and an underrated engine of Allied victory, as the historian of technology David Edgerton has just shown in Britain's War Machine), was beginning to unravel.
Is cultural change a butterfly-effect phenomenon that will always resist social science modeling, or are there patterns that have so far eluded us? I can't say, but meanwhile I don't think it's helpful to explain anything with a concept that resists definition.
Edward Tenner is a historian of technology and culture, and an affiliate of the Center for Arts and Cultural Policy Studies at Princeton's Woodrow Wilson School. He was a founding advisor of Smithsonian's Lemelson Center.
The comedian's n-bomb at the White House Correspondents’ Dinner highlights a generational shift in black culture.
Georgia McDowell was born the daughter of farmers and teachers in North Carolina in 1902. She was my great-grandmother, and she taught me to read, despite the dementia that clouded her mind and the dyslexia that interrupted mine. I loved Miss Georgia, though she kept as many hard lines in her home as she had in her classrooms. One of the hardest lines was common to many black households: The word “nigger” and all of its derivatives were strict taboos in person, on television, and on radio from any source, black or otherwise, so long as she lived and breathed. She’d kept the taboo through decades of teaching black students and raising black children. For most of my childhood, the taboo was absolute.
When Apple announced in 2013 that its next iPhone would include a fingerprint reader, it touted the feature as a leap forward in security. Many people don’t set up a passcode on their phones, Apple SVP Phil Schiller said at the keynote event where the Touch ID sensor was unveiled, but making security easier and faster might convince more users to protect their phones. (Of course, Apple wasn’t the first to stuff a fingerprint reader into a flagship smartphone, but the iPhone’s Touch ID took the feature mainstream.)
The system itself proved quite secure—scanned fingerprints are stored, encrypted, and processed locally rather than being sent to Apple for verification—but the widespread use of fingerprint data to unlock iPhones worried some experts. One of the biggest questions that hung over the transition was legal rather than technical: How might a fingerprint-secured iPhone be treated in a court of law?
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre’s more glib tendencies.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
For some, abandoning expensive urban centers would be a huge financial relief.
Neal Gabler has been a formative writer for me: His Winchell: Gossip, Power, and the Culture of Celebrity was one of the books that led me to think about leaving scholarship behind and write nonfiction instead, and Walt Disney: The Triumph of the American Imagination was the first book I reviewed as a freelance writer. To me, he exemplifies the best mix of intensive archival research and narrative kick.
So reading his recent essay, "The Secret Shame of Middle-Class Americans," was a gut punch: First, I learned about a role model of mine whose talent, in my opinion, should preclude him from financial woes. And, then, I was socked by narcissistic outrage: I, too, struggle with money! I, too, am a failing middle-class American! I, too, am a writer of nonfiction who should be better compensated!
The billionaire’s bid for the nomination was opposed by many insiders—but his success reveals the ascendance of other elements of the party coalition.
In The Party Decides, an influential book about how presidential nominees are selected, political scientists John Zaller, Hans Noel, David Karol, and Marty Cohen argue that despite reforms designed to wrest control of the process from insiders at smoke-filled nominating conventions, political parties still exert tremendous influence on who makes it to general elections. They do so partly through “invisible primaries,” the authors posited—think of how the Republican establishment coalesced around George W. Bush in 2000, long before any ballots were cast, presenting him as a fait accompli to voters who’d scarcely started to think about the election; or how insider Democrats elevated Hillary Clinton this election cycle.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
Three Atlantic staffers discuss “Home,” the second episode of the sixth season.
Every week for the sixth season of Game of Thrones, Christopher Orr, Spencer Kornhaber, and Lenika Cruz will be discussing new episodes of the HBO drama. Because no screeners are being made available to critics in advance this year, we'll be posting our thoughts in installments.
The Massachusetts Supreme Judicial Court will decide whether a local shrine should be tax-exempt—a decision that could have broad implications for faith organizations in America.
Property-tax battles are rarely sexy. But a case now in front of the Massachusetts Supreme Judicial Court, about whether the 21 religious brothers and sisters who run the Shrine of Our Lady of LaSalette in Attleboro should have to pay taxes, could have huge repercussions. The Court’s decision will be an important part of the ongoing debate in America about who defines religious practice—believers or bureaucrats—and whether religion itself should be afforded a special place under the law.
The case centers on a colonial-era law in Massachusetts that exempts religious houses of worship and parsonages from property taxes if they are used for religious worship or instruction. The shrine has enjoyed this perk since its founding in 1953. But in recent years, the City of Attleboro, nestled between Providence and Boston, has faced a tightening budget. It began looking to see where it could collect more revenue. The shrine, the only major tourist attraction in town, was an obvious target for tax collectors.