I finished Antony Beevor's majestic The Second World War last night. I immediately poured myself a drink. Beevor's book is a great look at how we think about "good" and "evil." I found it very easy to name "evil," and a lot harder to name "good."
This is evil:
Many prisoners of the Japanese had suffered a particularly gruesome and cruel fate. General MacArthur had given Australian forces the dispiriting task of clearing New Guinea and Borneo of the remaining pockets of Japanese. It became clear from all the reports collected later by U.S. authorities and the Australian War Crimes Section that the 'widespread practice of cannibalism by Japanese soldiers in the Asia-Pacific war was something more than merely random incidents perpetrated by individuals or small groups subject to extreme conditions. The testimonies indicate that cannibalism was a systematic and organized military strategy'.
The practice of treating prisoners as 'human cattle' had not come about from a collapse of discipline. It was usually directed by officers. Apart from local people, victims of cannibalism included Papuan soldiers, Australians, Americans, and Indian prisoners of war who had refused to join the Indian National Army. At the end of the war, their Japanese captors had kept the Indians alive so that they could butcher them to eat one at a time. Even the inhumanity of the Nazis' Hunger Plan in the east never descended to such levels.
Because the subject was so upsetting to families of soldiers who had died in the Pacific War, the Allies suppressed all information on the subject, and cannibalism never featured as a crime at the Tokyo War Crimes Tribunal in 1946.
It's not just the practice of cannibalism, but that the cannibalism proceeded from notions of racial superiority, militarism, and empire. The Nazis, who attempted to turn human body parts into consumer goods, were not much different.
But is this good?
The mass of incendiaries raining down in a tighter pattern than usual on the eastern side of the city accelerated the conglomeration of individual fires into one gigantic furnace. This created a chimney or volcano of heat which shot into the sky and sucked in hurricane force winds at ground level. This fanned the roaring flames still further. At 17,000 feet, the air-crew could smell roasting flesh.
On the ground, the blast of hot air tore off clothes, stripping people naked and setting their hair ablaze. Flesh was desiccated, leaving it like pemmican. As in Wuppertal, tarmac boiled and people became glued to it like insects on a flypaper. Houses would explode into a blaze in a moment. The fire service was rapidly overwhelmed. Those civilians who stayed in cellars suffocated or died from smoke inhalation or carbon-monoxide poisoning.
They, according to the Hamburg authorities later, represented between 70 and 80 per cent of the 40,000 people who died. Many of the other bodies were so carbonized that they were never recovered...
Harris's attempt to break German morale had failed. Yet he still refused to admit defeat and he certainly refused to recant. He despised government attempts to whitewash the bombing campaign by claiming that the RAF was going only for military targets and that civilian deaths were unavoidable. He simply regarded industrial workers and their housing as legitimate targets in a modern militarized state. He rejected any idea that they should be 'ashamed of area bombing.'
This is very clearly terrorism. British Prime Minister Winston Churchill privately acknowledged as much, noting that RAF Bomber Command was often bombing "simply for the sake of increasing the terror, though under other pretexts." Arthur Harris (Commander-in-Chief of Bomber Command) rejected the idea that you could separate German civilians from the German military.
When the entire state is mobilized for conquest, what is a civilian? In the Pacific theater, Curtis LeMay used similar logic in ordering the firebombing of Tokyo. (100,000 dead.)
I don't suggest an equivalence here. The big difference between the Nazi embrace of terrorism and the British/American embrace is that there was an actual debate. In Nazi Germany, those who debated were seen as weak and insufficiently loyal, and were often executed.
But that isn't enough. Do we get to call ourselves "democratic" and then judge ourselves by a Nazi standard? And there is something more -- throughout the war, you see American and British forces enacting harsher and harsher measures. Faced with the evil of the Nazis, or the evils of Japanese imperialism, we find the tools of evil more alluring. By the time American forces get to the Ardennes, they are not taking prisoners. And looking at Nazi tactics -- "surrendering" and then shooting -- can we say we'd do anything different?
This is what is ultimately most troubling for me about Beevor's work. He -- all at once -- catalogues all the flaws of the Allies, and robs you of your moral superiority. How should we think about the Soviet Union, which, among "The Big Three," bore the brunt of the Nazi assault? On one page Beevor will profile their heroic stand against an Army that sought to starve them out of existence. On another he will profile that same Army raping its way to Berlin. How do you think about the subjugators of Poland and the liberators of Auschwitz, when it's the same Army?
Perhaps in the same way you think about a Union Army enforcing emancipation, only to turn around and enforce the pilfering of Native American land. Perhaps in the same way you think about Britain holding out against the Nazis, while ruthlessly warring against Kenyans fighting for independence. During the Bush years there was a lot of debate about the usefulness of the concept of "evil." I don't have much trouble naming "forces for evil." What I have trouble with is naming "forces for good," to say nothing of "good wars." Indeed, it's very easy to name an "Axis of Evil." It's significantly harder to name an "Alliance for Good." Perhaps I can go with "forces for betterment" or "necessary wars." Certainly the Allied victory presented a "better" world than what Hitler promised.
I find that it's common for people who fight "good wars" to gild the cause in humanitarianism. And sometimes there are real, actual humanitarian outcomes -- ending the Holocaust, the destruction of the American slave society. But we didn't join the Second World War to end the Holocaust. And the North only joined the war against slavery when it became clear that it was the only path to reunification.
I am sorry for the confusion of all this. I am still thinking a lot of this through. I don't know.
Ta-Nehisi Coates is a national correspondent at The Atlantic, where he writes about culture, politics, and social issues. He is the author of The Beautiful Struggle and the forthcoming Between the World and Me.