Happiness and sanity are an impossible combination, Mark Twain once remarked. Fortunately, a new study suggests, happiness and healthy blood pressure make better bedfellows. The authors, both economists, found a strong inverse correlation between reported rates of hypertension and well-being in 16 European countries. Nations whose citizens report the highest levels of satisfaction with their lives (Denmark, the Netherlands, Sweden) also report the lowest incidence of high blood pressure, while countries with the lowest rates of happiness (Italy, Portugal, Germany) suffer elevated rates of high blood pressure. Nearly half the respondents in countries with the lowest hypertension rates pronounced themselves very satisfied with life, while less than a quarter of respondents in countries with the highest blood pressure said the same. The authors caution that they don’t have a good explanation for the link between the two sets of data. But they argue that as governments look to develop policies that expand well-being rather than wealth, they should factor blood pressure into their calculations.
—“Hypertension and Happiness Across Nations,” David G. Blanchflower and Andrew J. Oswald, National Bureau of Economic Research
In April, tens of thousands of Pakistanis protested in the streets of Karachi after an Islamic cleric, the head of one of the country’s many madrasas (religious schools), called for the government to impose Islamic law throughout the country. According to a new report by the International Crisis Group on madrasas and violence in Pakistan’s largest city, the protesters had reason to be worried. Since 9/11, Pakistani President Pervez Musharraf has repeatedly promised to crack down on the madrasas that train new generations of would-be jihadists, but his rhetoric has produced little action. Each promise has been “invariably followed by retreat,” states the report, and five years after his government launched the ambitious Madrasa Reform Project, the effort is “in shambles”: Mosques and madrasas are training and exporting fighters to Afghanistan and Kashmir, illegally seizing land to expand their sway over urban neighborhoods, and calling for jihad and sectarian violence. In Karachi, which may be home to more than 1,000 madrasas, suicide attacks in 2006 killed a U.S. diplomat, the country’s most prominent Shia political leader, and the entire leadership of a Sunni militant group. The report recommends that Musharraf follow through on his pledge to crack down on extremism, though it acknowledges the political risks: National elections are coming up this fall, and he depends on religious voters for much of his political support. But without meaningful change in the country’s educational system, the report warns, “the madrasas and the violent extremism they encourage are likely to become even more powerful.”
—“Pakistan: Karachi’s Madrasas and Violent Extremism,” International Crisis Group
For avian-flu watchers, the pandemic of 1918 shows what a new outbreak could mean for the world, but a study published by the National Academy of Sciences suggests that it may also offer lessons on how to contain avian flu. The U.S. doesn’t have enough vaccine to counter an outbreak, but the report argues that there’s more to readiness than vaccine stockpiles. The authors looked at “nonpharmaceutical interventions”—closing schools and movie theaters, banning public gatherings, quarantining infected households—used in 17 U.S. cities in 1918, and found a strong correlation between aggressive NPIs and lower transmission rates. Timing mattered, too: Cities that put multiple restrictions on their citizens early in the epidemic had peak death rates as much as 50 percent lower than cities that waited longer or never imposed any restrictions. The study also points out that cities that imposed NPIs suffered a second wave of infection once the restrictions were relaxed; had the restrictions stayed in place for more than eight weeks (the longest period most cities maintained them), death rates might have been even lower.
—“Public Health Interventions and Epidemic Intensity During the 1918 Influenza Pandemic,” R. Hatchett, C. Mecher, and M. Lipsitch, Proceedings of the National Academy of Sciences
Although we live in the information age, a new Pew Research Center study suggests that the American public is no better informed than it was before round-the-clock cable news and the Internet invaded our homes. After surveying more than 1,500 Americans to find out where they get their information and how much they know about current affairs, researchers concluded that Americans “are about as aware of major news events” as they have been for the last two decades. More people today know that the chief justice of the Supreme Court is conservative than did in 1989, for instance, but fewer know that the United States has a trade deficit. Half the country can identify Nancy Pelosi as the speaker of the House, whereas in 1989 just 14 percent could correctly identify Tom Foley as the speaker; on the other hand, Americans today have more trouble identifying the U.S. vice president and the Russian president than they did in the era of Dan Quayle and Boris Yeltsin. The most knowledgeable Americans were those who got their news from the Web sites of major papers and those who watched programs like The Colbert Report or The Daily Show; they correctly answered 54 percent of the questions about current affairs, while regular viewers of local TV news and network morning shows got only about 35 percent right. The survey found that there isn’t necessarily a trade-off between hard-news knowledge and pop-culture savvy: Respondents who demonstrated a “high” knowledge of politics and world events were also adept at identifying celebrities such as Beyoncé Knowles. And while it’s hard to know which sources provide the best information, the report notes that well-informed people gather their news from an average of 7.0 sources, compared with the overall average of 4.6.
Early America was so thick with buffalo that explorers said the land seemed draped with “one black robe.” By 1900, fewer than a hundred buffalo remained. But it was European demand, more than Manifest Destiny, that nearly drove the continent’s most majestic native species to extinction, according to a paper by an economist from the University of Calgary. Until about 1870, the author argues, American hunters killed buffalo only in moderate numbers, usually for their meat and hides. But in 1871, English and German tanners figured out how to easily and cheaply turn buffalo hides into boot soles and machine belts, and, the author suggests, as European armies reequipped after the Franco-Prussian War, the profits suddenly available to hunters set off a decade-long buffalo bloodbath, during which most of the meat was left to rot on the prairie. America, still recovering from the Civil War, needed the money, so the government let the massacre continue—a reckless act that the paper likens to the tendency of today’s developing countries to strip their land of natural resources to satisfy their trading partners in the developed world.
—“Buffalo Hunt: International Trade and the Virtual Extinction of the North American Bison,” M. Scott Taylor, National Bureau of Economic Research
It’s a dictum of the dismal science: “When America sneezes, the rest of the world catches a cold.” But the International Monetary Fund’s most recent World Economic Outlook suggests that may no longer be the case: While the U.S. remains a dominant force in the global marketplace, many countries, the report states, may be “decoupling” sufficiently from the American economy that they’re gaining some protection from painful chain reactions. The IMF’s researchers also suggest that America’s disruptive impact may have been overstated: “Past episodes of highly synchronized growth declines across the globe”—like those created by the oil-price shocks of the mid-1970s, or the bursting of the tech bubble in 2000—“were not primarily the result of developments specific to the United States, but rather were caused by factors that affected many countries at the same time.” The report notes that one reason the global economy continues to hum along, even as growth slows in the U.S., is that America’s current sluggishness is driven by slumps in its housing and manufacturing sectors, which have a limited effect on other major industrial countries. Of course, it points out, given that America accounts for about a fifth of the world’s economic activity—a percentage that has changed little in the past 30 years—its sneezes remain “relevant.” But the rest of the world appears to be building a sturdier immune system.
—“World Economic Outlook: Spillovers and Cycles in the Global Economy,” International Monetary Fund
When it comes to catching a kid in a fib, teachers are among the savviest lie detectors, according to a recent study published in Behavioral Sciences and the Law. The researchers selected a group of preschoolers and left each of them seated alone in a room, asking them not to peek at a toy that was behind them, out of their view. The researchers videotaped their actions, then asked each child, “Did you peek?” The responses were shown to 64 adults selected from summer courses at Rutgers University, who were asked to determine whether each child was telling the truth. The adults’ scores varied widely—they were right 12 percent to 84 percent of the time—but their average score was just 41 percent; chance alone would have given them 50 percent. (Most adults, including parents, erred on the side of suspicion, believing some children were lying when they were being honest.) But one group of adults—those who work with children professionally, including teachers and child psychologists—routinely outperformed the rest of the sample. More than a third of the professionals detected the liars at least 60 percent of the time; only one nonprofessional was able to match that rate.
—“Adults’ Ability to Detect Children’s Lying,” Angela M. Crossman and Michael Lewis, Behavioral Sciences and the Law
Both common sense and informal surveys of U.S. soldiers and their spouses have suggested that longer, more frequent deployments to Iraq and Afghanistan, combined with the stress and uncertainty of combat, make military marriages more likely than ever to end in divorce. The Rand Corporation recently prepared a study for the Pentagon on the effect of divorce on military performance and retention rates, and the results challenge the conventional wisdom: The study finds no significant increase in divorce in the military during recent conflicts. Researchers examined personnel records for every member of the U.S. military serving between fiscal years 1996 and 2005 (more than 6 million people), and found, to their surprise, that while the divorce rate has risen steadily since the start of the war in 2001, it’s still roughly the same as it was in 1996, when the military was under significantly less stress. So, can married soldiers breathe easier when they’re parted from their spouses? The men can, but the study found that the risk of divorce for female service members, particularly enlisted women, is several times the risk for their male counterparts. Men, it seems, have a harder time waiting at home for the return of their soldier wives.
—“Families Under Stress: An Assessment of Data, Theory, and Research on Marriage and Divorce in the Military,” Benjamin R. Karney and John S. Crown, Rand Corporation
Young people are generally full of themselves, but a new study suggests that today’s kids are far more self-centered than those of preceding generations. A team of five university psychologists analyzed the results of the Narcissistic Personality Inventory, a 40-question survey administered to 16,475 current and recent college students nationwide between 1982 and 2006; the test asked students to agree or disagree with statements like “I think I am a special person” and “If I ruled the world, it would be a better place.” The results, the authors argue, illustrate a steady increase in narcissism—a “positive and inflated view of the self.” Overall, almost two-thirds of the most recent sample displayed a higher level of narcissism than the 1982 average. Why the increase? The researchers speculate that technology may have something to do with it. Narcissism is especially acute among students born after 1982, the cohort most likely to use “self-focused” Web sites like MySpace and YouTube. Whatever the cause, the researchers argue that increased narcissism can have pernicious effects, on the individual and on society. They cite previous studies showing that narcissists have trouble forming meaningful relationships, tend to be materialistic, and are prone to higher levels of infidelity, substance abuse, and violence.