Primary Sources
Selections from reports, studies, and other documents. This month: Osama bin Laden and Jacques Chirac voted able "to do the right thing"; the coming suburban ghetto?; why the Vikings would have liked global warming
What do Osama bin Laden and Jacques Chirac have in common? In a recent Pew Research Center sampling of post-Iraq War attitudes around the world, a majority of Jordanians expressed confidence in both men's ability "to do the right thing." And it wasn't just Jordan: bin Laden inspired the faith of majorities in Indonesia and the Palestinian territories, as Chirac did in Lebanon and Morocco. (In the Islamic world George W. Bush was trusted by a majority of those surveyed only in Kuwait.) In every Muslim-majority country surveyed, pro-U.S. sentiment has fallen precipitously over the past year, with little sign of improvement since the overthrow of Saddam Hussein. Pro-American sentiment breaks 30 percent only in Nigeria and Kuwait—and in Jordan and the Palestinian territories it has dropped all the way to one percent.
—"Pew Global Attitudes Project: Views of a Changing World," the Pew Research Center
Consider almost any recent or ongoing civil war and you're likely to find somebody blaming "ethnic tensions" or "tribal rivalries" for it. This is largely unfair, according to a World Bank review of fifty-two such conflicts from 1960 to 1999, which uses a statistician's approach to argue that the causes of civil strife are more often economic than ethnic. All other factors being equal, a low-income country has a 17.1 percent chance of falling into civil war in a given five-year period. Economic development significantly reduces that chance: if two percent growth is sustained for a decade, the risk of war drops to 12.3 percent. Economic diversification also helps: the odds of civil war fall to 11 percent when primary-commodity exports (notably oil) account for 10 percent or less of gross domestic product, but they rise to a remarkable 33 percent in countries where such exports account for more than 30 percent of GDP. Ethnic diversity has a weaker impact. Countries so diverse that no single group can dominate have a slightly below average probability of civil strife, and in societies in which one ethnic group holds a majority, the chances of war are higher than average—but only just. The single best (if almost tautological) predictor of the likelihood of civil war turns out to be neither economics nor demographics but history: a country that has just emerged from civil war runs a 44 percent risk of a return to conflict within five years.
—"Breaking the Conflict Trap: Civil War and Development Policy," the World Bank
Is economic aid a more effective weapon against terrorism than military force? It's an attractive and much-talked-about idea, but a new study by the RAND Corporation suggests that if funds are insufficient or poorly distributed (as they very often are), aid programs run the risk of backfiring and increasing support for terrorism. Examining three recent cases in which development policies sought to reduce terrorism (in the southern Philippines, the territories occupied by Israel, and Northern Ireland), the study concludes that the first two efforts failed utterly to inhibit political violence. Both programs suffered from corruption and favoritism, and the sums involved were relatively small—in the Philippines averaging just $6.00 per capita per year in the targeted areas. Only in Northern Ireland did development money calm political violence, because only there was aid delivered in a transparent and evenhanded fashion, and on a large enough scale (more than $500 per capita annually). Can the United States, as it dispenses aid in a hostile Muslim world, improve on this record? Perhaps—but the report's authors note that during the recent war in Afghanistan, the United States pledged $150 million to help alleviate poverty in Uzbekistan, a Muslim ally. Spread across Uzbekistan's roughly 25 million people, that sum translates to a familiar number: $6.00 per capita per year.
—"Terrorism & Development: Using Social and Economic Development to Inhibit a Resurgence of Terrorism," the RAND Corporation
The Supreme Court has upheld at least some varieties of race-based affirmative action, but perhaps college administrators should be worrying less about their students' color and more about their socioeconomic class. According to an ambitious study from the Century Foundation, 74 percent of students at America's 146 most selective colleges come from the most privileged quarter of the U.S. population, whereas just three percent emerge from the most disadvantaged quarter. Blacks and Hispanics are also under-represented, but far less starkly: they make up 12 percent of students at these colleges, compared with 28 percent of the country's college-age population. Overall, black and Hispanic students at these top-tier schools outnumber students from the bottom fourth of America's socioeconomic ladder by four to one—which raises the possibility that middle- and upper-income students from minority groups could be benefiting from affirmative action at the expense of poorer, more disadvantaged whites. The study's authors are careful to say that socioeconomic diversity should serve as a complement to, not a replacement for, racial balance. But their numbers suggest that if colleges are serious about admitting truly diverse student bodies, a formal system of class-based affirmative action should be at least as high a priority as the ongoing defense of racial preferences.
—"Socioeconomic Status, Race/Ethnicity, and Selective College Admissions," the Century Foundation
The good news, according to a recent study by the Brookings Institution, is that the number of U.S. neighborhoods with a high poverty rate declined dramatically in the 1990s. The bad news lurking in the study is that poverty hasn't gone away; indeed, in some cases it has migrated from urban centers to inner-ring suburbs. The study finds that the number of neighborhoods with a poverty rate of 40 percent or higher dropped by more than a quarter in the 1990s, from 3,417 to 2,510, significantly reducing the amount of fertile ground for social ills such as gang violence and poorly performing schools. But the study also warns that America's cities may have entered a period of transition, in which the poor are gradually shifting out of a gentrifying urban core only to become clustered elsewhere. Indeed, poverty rates in some inner-ring suburbs rose during the 1990s, creating something of a bull's-eye pattern when plotted on a map—and hinting at a future for suburbia that is far more dismal than the strip malls and SUVs of today.
—"Stunning Progress, Hidden Problems: The Dramatic Decline of Concentrated Poverty in the 1990s," the Brookings Institution
Just how hot was the twentieth century? Warmer than its immediate predecessors, but probably colder than the Medieval Warm Period, when the Vikings colonized Greenland and olive trees flourished as far north as Germany. A meta-study published earlier this year in the journal Climate Research, drawing on a comprehensive review of more than 100 studies of coral, glacial ice cores, tree rings, and other indicators of climatic history, concludes that despite widespread concern over global climate change, the twentieth century was not unusually warm by historical standards. (This year's extended East Coast winter might suggest that so far the same holds true for the twenty-first.) The study's authors admit that the historical data on climate change are often scattered, particularly in the Southern Hemisphere, and they are also quick to note—no doubt anticipating environmentalist critiques—that the lack of extreme warmth in the twentieth century "does not argue against human impacts on local and regional scales." But even with these caveats, the study makes a strong case that human societies have always been able to cope with significant climatic shifts. Or almost always: as the Medieval Warm Period came to an end, so did Viking life in Greenland.
—"Proxy Climatic and Environmental Changes of the Past 1000 Years," Willie Soon and Sallie Baliunas, Climate Research
Sure, action video games may desensitize kids to violence and mayhem, but—according to a study conducted at the University of Rochester—they also improve visual skills. Researchers tested male undergraduates who reported playing games such as Grand Theft Auto III and Halo several times a week (only one qualified female subject could be found) and determined that frequent gamers notice fast-vanishing objects more reliably than nongamers and are better at identifying letters that appear quickly and then disappear. The study's authors have suggested that these findings could be put to use in designing training programs for fighter pilots and other military personnel, or in helping to rehabilitate the visually impaired—which was widely construed in the press to mean that video games are good for you. For the man on the street, though, the skill set improved by playing video games would seem to be useful primarily for the purpose of ... playing video games.
—"Action Video Game Modifies Visual Selective Attention," C. Shawn Green and Daphne Bavelier, Nature