The four-minute radio address ended a war, dismantled a 20-year imperial ideology, and began Japan's rebirth into the nation it is today.
On this day in 1945, one week after atomic bombs had obliterated the cities of Hiroshima and then Nagasaki, radios across Japan crackled with another shocking announcement, one that would come to change the course of Japanese history perhaps as much as did the atomic bombs Little Boy and Fat Man. At noon, Emperor Hirohito spoke directly to his subjects for the first time in his reign. His announcement would shock Japan, but it would also transform it, altering in a few short minutes the entire mission of the Japanese nation in ways that it, and the world, still feel today.
Hirohito was more than Japan's head of state. He was its divine monarch and the personification of both the nation and its spiritual imperative for imperial expansion, "the literally living embodiment of Japan past and present, a paradigm of moral excellence," according to Herbert Bix's Pulitzer-winning biography. Hirohito both embodied and galvanized imperial Japan's race-based nationalism, its radically militarist ideology that had led it to sow war and much worse across Asia.
Hirohito personally sat, according to Bix, "at the center of his nation's political, military, and spiritual life in the broadest and deepest sense" during the expansion that "cost nearly 20 million Asian lives, more than 3.1 million Japanese lives, and more than 60,000 Western Allied lives." The Pacific War was, in the ultra-nationalist ideology that gripped Japan for the first half of the 20th century, a "holy war," and waged in Hirohito's name.
Japan's war-rattled civilians had good reason to fear that Hirohito's radio address might bring terrible news. Surrender was officially forbidden in the Japanese military, and in the closing years of the war, Japanese civilians were told that they too might have to choose death to protect the dignity of the nation and the sanctity of the imperial ideology. "The hundred million," the propaganda's term for the civilians at home, might have to embrace a death that would be beautiful in its tragedy, "like shattered jewels."
As the American military pressed in, Japan's war machine had turned inward, as John W. Dower documented in his masterful, Pulitzer-winning history. "Japanese died in hopeless suicide charges, starved to death in the field, killed their own wounded rather than let them fall into enemy hands, and murdered their civilian compatriots in places such as Saipan and Okinawa," he wrote. At home, "They watched helplessly as fire bombs destroyed their cities -- all the while listening to their leaders natter on about how it might be necessary for the 'hundred million' all to die 'like shattered jewels.'"
And this is what many Japanese feared their emperor would ask of them, Dower wrote: to "fight to the bitter end and die" as they'd been indoctrinated, or to end the imperial mission by their own hands in ritual suicide rather than allow foreigners that right.
When the emperor's voice beamed across the country, and out beyond it on shortwave signals to the troops stationed throughout East Asia, it was the first time that the vast majority of his subjects had heard him. His voice was high-pitched and stilted, and he spoke in a classical Japanese more difficult to understand than everyday conversation. Still, the message was clear: surrender. The unthinkable.
"We have ordered our government to communicate to the governments of the United States, Great Britain, China, and the Soviet Union that our empire accepts the provisions of their joint declaration," he said, referencing the allies' demand for unconditional surrender. But perhaps even more surprising than Hirohito's call for capitulation were the terms he used, which seemingly reversed the entire ideology of war and expansion that had been synonymous with his rule.
"To strive for the common prosperity and happiness of all nations, as well as the security and wellbeing of our subjects, is the solemn obligation which has been handed down by our imperial ancestors and which lies close to our heart," he explained. "The enemy has begun to employ a new and most cruel bomb, the power of which to do damage is, indeed, incalculable, taking the toll of many innocent lives. Should we continue to fight, not only would it result in an ultimate collapse and obliteration of the Japanese nation, but also it would lead to the total extinction of human civilization."
He declared that the military would be disarmed, suggesting this would happen not because disarmament had been forced upon Japan (it had), but because Japan had made the difficult choice to privilege peace. It wasn't wholly true, but it helped replace the imperial ideology of war with an ideology of peace that persists to this day.
Hirohito, after years of indirectly pressing his citizens to carry the burdens of war and imperialism, of an ideology that demanded international primacy, now asked them directly to carry the very different burdens of peace, humility, and lower status. "The hardships and sufferings to which our nation is to be subjected hereafter will be certainly great," he warned. "However, it is according to the dictates of time and fate that we have resolved to pave the way for a grand peace for all the generations to come by enduring the unendurable and suffering what is not sufferable." He ended by urging his long-suffering citizens to "Cultivate the ways of rectitude, foster nobility of spirit, and work with resolution" so as to "keep pace with the progress of the world."
To "endure the unendurable and suffer what is not sufferable" would become a sort of national motto in the following seven years of American occupation, "quoted times beyond counting" in Japanese media, according to Dower, such that it "carried a clear sense of purpose." It came to describe not just the humiliation of defeat, the pain of accepting what 20 years of ultranationalism had indoctrinated into the Japanese as the ultimate pain, but Japan's struggle to find an entirely new identity and place in the world.
"Enduring the unendurable" also meant surviving Japan's near-total collapse. The allied bombing campaign had destroyed one-third of the nation's wealth, according to the American occupation authority's estimates, a loss roughly comparable to that of the U.S. Great Depression. Urban living standards plummeted to 35 percent of pre-war levels. In the country's 60 or so largest cities, bombing had destroyed nearly half of the structures, rendering 30 percent of their residents immediately homeless. Food became scarce, and Dower documents some Japanese cities recommending "emergency diets" of "acorns, grain husks, peanut shells, and sawdust" as well as "silkworm cocoons, worms ... or a powder made by drying the blood of cows, horses, and pigs." Disease and starvation spread.
Meanwhile, millions of Japanese soldiers and colonists abroad found, with the empire's collapse, that they had no way to get home and few rights, if any, in the newly independent colonies. As many as 68,000 Japanese in China were conscripted into the communist insurgency, Dower reports, and around 1.6 million Japanese in the Soviet Union were made to contribute labor. Of those, 300,000 never returned home. In the 1980s, the Soviet government released the names of 46,000 who had been buried in Siberia; the rest have never been accounted for.
Hirohito's historic address marked the end of World War Two and the end of imperial Japan's ultranationalist ideology, but it was also a beginning: of the American occupation and of a new Japan. "The losers wished to both forget the past and to transcend it," Dower wrote, and Japan set about to rise out of the ashes of its own destruction, this time with ideals and goals almost the polar opposite of before. "The ideals of peace and democracy took root in Japan -- not as a borrowed ideology or imposed vision, but as a lived experience and seized opportunity."
In a generation, Japan achieved both full democracy and the astonishing, much-studied "economic miracle". This is still the Japan of today: developed, democratic, and peaceful. The factors, internal and external, that led the country from an ultranationalist war machine to a land of passivity and high-tech exports are as numerous as they are complicated. But the moment, 67 years ago today, when Hirohito's near-falsetto came over the airwaves and commanded the Japanese to "endure the unendurable" is a central inflection point in the Japanese death and rebirth that played such a major role in the 20th century.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Without the financial support that many white families can provide, minority young people have to continually make sacrifices that set them back.
He died on a Saturday.
My mother and I had planned to pick my dad up from the hospital for a trip to the park. He loved to sit and watch families stroll by as we chatted about oak trees, Kona coffee, and the mysteries of God. This time, the park would miss him.
His skin, smooth and brown like the outside of an avocado seed, glistened with sweat as he struggled to take his last breaths.
In that next year, I graduated from grad school, got a new job, and looked forward to saving for a down payment on my first home, a dream I had always had, but found lofty. I pulled up a blank spreadsheet and made a line item called “House Fund.”
Places like St. Louis and New York City were once similarly prosperous. Then, 30 years ago, the United States turned its back on the policies that had been encouraging parity.
Despite all the attention focused these days on the fortunes of the “1 percent,” debates over inequality still tend to ignore one of its most politically destabilizing and economically destructive forms. This is the growing, and historically unprecedented, economic divide that has emerged in recent decades among the different regions of the United States.
Until the early 1980s, a long-running feature of American history was the gradual convergence of income across regions. The trend goes back to at least the 1840s, but grew particularly strong during the middle decades of the 20th century. This was, in part, a result of the South catching up with the North in its economic development. As late as 1940, per-capita income in Mississippi, for example, was still less than one-quarter that of Connecticut. Over the next 40 years, Mississippians saw their incomes rise much faster than did residents of Connecticut, until by 1980 Mississippi's per-capita income had climbed to 58 percent of Connecticut's.
It was widely seen as a counter-argument to claims that poor people are "to blame" for bad decisions and a rebuke to policies that withhold money from the poorest families unless they behave in a certain way. After all, if being poor leads to bad decision-making (as opposed to the other way around), then giving cash should alleviate the cognitive burdens of poverty, all on its own.
Sometimes, science doesn't stick without a proper anecdote, and "Why I Make Terrible Decisions," a comment published on Gawker's Kinja platform by a person in poverty, is a devastating illustration of the Science study. I've bolded the portions I found most insightful, but it's a moving testimony all the way through.
The sport is becoming an enterprise where underprivileged young men risk their health for the financial benefit of the wealthy.
Football can be a force for good. The University of Missouri’s football team proved it earlier this month when student athletes took a facet of campus life that’s often decried—the cultural and economic dominance of college football—and turned it into a powerful leverage point in the pursuit of social justice. Football can build a sense of community for players and fans alike, and serve as a welcome escape from the pressures of ordinary life. The sport cuts across distinctions of race, class, geography, and religion in a way few other U.S. institutions do, and everyone who participates reaps the benefits.
But not everyone—particularly at the amateur level—takes on an equal share of the risk. College football in particular seems headed toward a future in which it’s consumed by people born into privilege while the sport consumes people born without it. In a 2010 piece in The Awl, Cord Jefferson wrote, “Where some see the Super Bowl, I see young black men risking their bodies, minds, and futures for the joy and wealth of old white men.” This vision sounds dystopian but is quickly becoming an undeniable reality, given new statistics about how education affects awareness about brain-injury risk, as well as the racial makeup of Division I rosters and coaching staffs. The future of college football indeed looks a lot like what Jefferson called “glorified servitude,” and even as information comes to light about the dangers and injustices of football, nothing is currently being done to steer the sport away from that path.
Why are so many kids with bright prospects killing themselves in Palo Alto?
The air shrieks, and life stops. First, from far away, comes a high whine like angry insects swarming, and then a trampling, like a herd moving through. The kids on their bikes who pass by the Caltrain crossing are eager to get home from school, but they know the drill. Brake. Wait for the train to pass. Five cars, double-decker, tearing past at 50 miles an hour. Too fast to see the faces of the Silicon Valley commuters on board, only a long silver thing with black teeth. A Caltrain coming into a station slows, invites you in. But a Caltrain at a crossing registers more like an ambulance, warning you fiercely out of its way.
The kids wait until the passing train forces a gust you can feel on your skin. The alarms ring and the red lights flash for a few seconds more, just in case. Then the gate lifts up, signaling that it’s safe to cross. All at once life revives: a rush of bikes, skateboards, helmets, backpacks, basketball shorts, boisterous conversation. “Ew, how old is that gum?” “The quiz is next week, dipshit.” On the road, a minivan makes a left a little too fast—nothing ominous, just a mom late for pickup. The air is again still, like it usually is in spring in Palo Alto. A woodpecker does its work nearby. A bee goes in search of jasmine, stinging no one.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
Nuts-and-bolts Washington coverage has shifted to subscription-based publications, while the capital's traditional outlets have shrunk.
Back in 2009, I had a job with a Washington, D.C.-based newsletter called Water Policy Report. It wasn’t exactly a household name, but I was covering Congress, the federal courts, and the Environmental Protection Agency—a definite step up from the greased-pig-catching contests and crime-blotter stories I had chased at a community newspaper on Maryland’s Eastern Shore, my first job out of college.
One of my responsibilities at the newsletter was to check the Federal Register—the official portal that government agencies use to inform the public about regulatory actions. In December of that year I noticed an item that said that the Environmental Protection Agency had decided that existing pollution controls for offshore oil-drilling platforms in the Gulf of Mexico were adequate, and that there wasn’t enough pollution coming from those platforms to warrant further review or action.
“Wanting and not wanting the same thing at the same time is a baseline condition of human consciousness.”
Gary Noesner is a former FBI hostage negotiator. For part of the 51-day standoff outside the Branch Davidian religious compound in Waco, Texas, in 1993, he was the strategic coordinator for negotiations with the compound’s leader, David Koresh. This siege ended in infamous tragedy: The FBI launched a tear-gas attack on the compound, which burned to the ground, killing 76 people inside. But before Noesner was rotated out of his position as the siege’s head negotiator, he and his team secured the release of 35 people.
Jamie Holmes, a Future Tense Fellow at New America, spoke to Noesner for his new book Nonsense: The Power of Not Knowing. “My experience suggests,” Noesner told Holmes, “that in the overwhelming majority of these cases, people are confused and ambivalent. Part of them wants to die, part of them wants to live. Part of them wants to surrender, part of them doesn’t want to surrender.” And good negotiators, Noesner says, are “people who can dwell fairly effectively in the areas of gray, in the uncertainties and ambiguities of life.”
A Chicago cop now faces murder charges—but will anyone hold his colleagues, his superiors, and elected officials accountable for their failures?
Thanks to clear video evidence, Chicago police officer Jason Van Dyke was charged this week with first-degree murder for shooting 17-year-old Laquan McDonald. Nevertheless, thousands of people took to the city’s streets on Friday in protest. And that is as it should be.
The needlessness of the killing, captured on dash-cam video, is clear and unambiguous.
Yet that dash-cam footage was suppressed for more than a year by authorities citing an investigation. “There was no mystery, no dead-end leads to pursue, no ambiguity about who fired the shots,” Eric Zorn wrote in The Chicago Tribune. “Who was pursuing justice and the truth? What were they doing? Who were they talking to? With whom were they meeting? What were they trying to figure out for 400 days?”