Primary Sources

The kind of body count Americans can tolerate; the aggrieved boyfriend as terrorist; why the "dirty bomb" threat is real; finally—the truth about bullies and their victims

Casualties of War

The American public can tolerate far higher casualties than is commonly thought, particularly if they are incurred in humanitarian missions; what it cannot tolerate is the prospect of defeat. This thesis, originally advanced by Peter Feaver and Christopher Gelpi in their book Choosing Your Battles: American Civil-Military Relations and the Use of Force (2004), is borne out by their recent analysis of polling data regarding the Iraq War and its aftermath. American reactions to combat casualties, the authors argue, tend to vary depending on the "frame" in which those casualties are viewed. If the mission appears to be going well, as in post-Inchon Korea, Panama, and the 1991 Gulf War, public approval of the President tends to rise even as casualties increase, in what Feaver, Gelpi, and the study's third co-author, Jason Reifler, term the "rally round the flag effect." If the public perceives the mission as going badly, however, as happened in Vietnam after the Tet Offensive of 1968 and in Lebanon after the Beirut barracks bombing of 1983, then further casualties will produce a backlash and a drop in presidential approval ratings. Thus far this pattern has held true for Iraq: President Bush's approval ratings rose during combat operations (even as the number of casualties soared) and then fell once guerrilla insurgency, with its suggestion of quagmire, replaced battlefield combat as what the authors call the "dominant media frame" for Iraq. The President's principal political challenge, then, seems to be persuading the country that American deaths are the price of a final victory and not a sign of a looming debacle.

"Paying the Human Costs of War: Iraq 2003," Christopher Gelpi, Peter Feaver, Jason Reifler, Triangle Institute for Security Studies and Duke University

One Man's Terrorist ...

Evaluating the success of American law enforcement's post-9/11 battle with terrorism may depend on just how one defines the term "terrorist," according to a recent analysis of Justice Department data. In the two-year period following the World Trade Center attacks, federal investigative agencies referred significantly more cases classified as "terrorism" (3,500) to prosecutors than in the two years prior to the attacks. More such cases (730) were also prosecuted, and more convictions were won (341). Yet during the two years after the attacks, only sixteen people were sentenced to five years or more in prison for terrorism—fewer than during the two years preceding 9/11. Moreover, this "terrorist" tally includes not only the would-be shoe bomber Richard Reid but also such threats to national security as a Georgia man who detonated a pipe bomb in his girlfriend's empty car and a Texas man who conspired from his prison cell to assassinate a federal judge. Other facts cast additional doubt on the efficacy of the Justice Department's wide net: for instance, federal prosecutors deemed only 41 percent of the terrorism referrals they received worth pursuing (whereas 68 percent of all criminal cases referred to the department were prosecuted); and the majority of terrorism convictions (276 out of 341) resulted in no prison time at all or in sentences of less than a year. Even among those convicted within the narrower category of "international terrorism," the median sentence was fourteen days—the stuff of traffic violations, not al-Qaeda operations.

"Criminal Terrorism Enforcement Since the 9/11/01 Attacks," Transactional Records Access Clearinghouse, Syracuse University

The "Dirty Bomb" Scenario

After Jose Padilla was arrested in 2002 for allegedly planning a "dirty bomb" attack, Attorney General John Ashcroft described such a device—conventional explosives wrapped around radioactive contaminants—as an instrument of "mass death and injury." A team at the National Defense University has spent the past year assessing the real nature of the threat, and its findings are disturbing: a successful dirty-bomb attack might contaminate an area the size of the Washington Mall, and the "maximum credible events" envisioned by the report could kill dozens or hundreds, sicken thousands, and ruin a metropolis (contaminated buildings would probably have to be razed and the debris carted off, along with a meter of topsoil, in what experts call "muck and truck"). Worse, terrorists might achieve the "dirty" results without the bomb, by quietly releasing radiation through smoke or an aerosol, so that the damage would be done before the attack was even noticed. To explore the consequences of such an attack, researchers looked at a real-life disaster that occurred in the city of Goiânia, Brazil, in 1987. Junk-metal pirates salvaged cesium from an abandoned radiation lab and passed the "glowing blue material" to family and friends. The cesium eventually spread through buses, ambulances, animal fur, bars, and restaurants, until 112,000 Brazilians were tested for radiation in an Olympic-size soccer stadium; 249 people were determined to have been contaminated, forty-nine were hospitalized, and five died. (Goiânia, bitter from the ensuing economic isolation and stigma, added the universal symbol for radioactivity to its flag.)

"Dirty Bombs: The Threat Revisited," Peter D. Zimmerman with Cheryl Loeb, National Defense University

The Thick Red Line

In Chicago an African-American earning more than $90,480 a year is 2.3 times as likely to be turned down for a home loan as a white person making less than $37,700. This remarkable finding, contained in a new report from the Association of Community Organizations for Reform Now (ACORN), helps explain why less than half of black families own their homes (compared with three quarters of white families), and why this racial divide in homeownership is likely to increase. In 2002 African-Americans overall were 2.38 times as likely to have a mortgage application denied as were whites (up from 2.06 in 1997), and this disparity actually increases with income level: a low-income African-American is only 1.55 times as likely to be denied a mortgage as a low-income white, whereas an upper-income African-American is 2.83 times as likely to be turned down as a white of similar income. The good news: mortgage-denial rates for blacks overall have decreased (from 57 percent to 30 percent) since 1997, and the number of home loans blacks receive has increased (from 139,544 a year to 189,817). But 72 percent of this increase, ACORN points out, consists of higher-interest loans, suggesting not only that blacks will continue to be far less likely than whites to own homes, but also that those who do will be more heavily encumbered by debt.

"The Great Divide: Home Purchase Mortgage Lending Nationally and in 115 Metropolitan Areas," Association of Community Organizations for Reform Now

My Big Fat American Child

Maybe those lawsuits against McDonald's weren't so silly after all. Two new studies find that American teens are heavier than their counterparts in most other industrialized countries, and that fast food probably bears much of the blame. According to a study of fifteen-year-olds in fifteen developed nations, 15 percent of girls and nearly 14 percent of boys in the United States are classified as obese; the next highest numbers (5.5 percent of girls and almost 11 percent of boys) come from Greece. (In slender France the numbers are 4 and not quite 3 percent, and in skinny Lithuania 2 and 1 percent.) Meanwhile, another recent study finds that nearly a third of U.S. children aged four to nineteen eat at least one fast-food meal daily—and that they take in 187 calories more a day than those who don't, for an average of about six extra pounds a year.

"Body Mass Index and Overweight in Adolescents in 13 European Countries, Israel, and the United States," Inge Lissau et al., Archives of Pediatrics & Adolescent Medicine; "Effects of Fast-Food Consumption on Energy Intake and Diet Quality Among Children in a National Household Survey," Shanthy Bowman et al., Pediatrics

Machiavelli for Twelve-Year-Olds

In news that will doubtless come as no surprise to anyone who has survived junior high, a study published in the December issue of Pediatrics finds that sixth-grade bullies tend to be popular and psychologically strong, and are often viewed as the "coolest" by their classmates. Their victims, on the other hand, are typically miserable: lonely, unstable, and socially marginalized. Commonsensical as these findings seem, they fly in the face of previous research—dutifully repeated by countless parents to their bullied offspring—suggesting that bullies actually suffer from depression and low self-esteem. Those earlier findings, it seems, mistakenly relied on "self-reports of being a bully" while ignoring the reality that, as the authors of the Pediatrics report dryly point out, "it is unlikely that bullies as a group provide accurate self-reports of how they treat others." When researchers instead use a system that relies on the consensus of a large group of students, it becomes clear that the only bullies likely to suffer depression are those who are both bullies and victims—victims, that is, who take out their misery on students unlucky enough to be even further down the junior high food chain.

"Bullying Among Young Adolescents: The Strong, the Weak, and the Troubled," Jaana Juvonen, Sandra Graham, Mark A. Schuster, Pediatrics