A copy of The New York Times published May 8, 1945, bearing Kennedy's scoop (AP/Rick Bowmer)
On May 7, 1945, Associated Press Paris bureau chief Ed Kennedy set off one of the biggest journalism controversies of the 20th century. Nazi Germany had surrendered unconditionally to the Allies early that morning in a schoolhouse in Reims. Unbelievable as it may seem today, Supreme Allied Commander Gen. Dwight Eisenhower imposed a news blackout on the surrender, under orders from President Truman. Big official secrets were easier to keep then, but not always. Kennedy had access to an unauthorized phone line. Gambling with his career, he used that line to break the surrender story. His exclusive, eyewitness account of the ceremony got huge news play and led to mass rejoicing in Paris, London, New York, and elsewhere.
For the gaunt, intense Kennedy, it became the scoop from Hell. Allied headquarters stripped away his press credentials, denounced him personally for defying the rules, and banished him to New York, where the AP fired him. Meanwhile, 54 rival reporters who had abided by the news embargo signed a statement branding Kennedy a double-crosser. The label lingered. In 1960, Walter Cronkite of CBS, a former United Press war correspondent, refused to stand when Kennedy offered his hand, according to a journalist who witnessed the encounter.
Kennedy tried for years to repair his damaged reputation, publishing a lengthy self-defense in the Atlantic ("I'd Do It Again," August 1948). Among other points, Kennedy argued that Ike had not ordered the blackout for legitimate reasons of military security. He had done so for political reasons that did not justify censorship. Soviet dictator Josef Stalin wanted to stage a second surrender ceremony in Berlin, to sell the illusion that the Nazis had surrendered first to the Soviets. He did not want his propaganda ceremony overshadowed by news of the authentic surrender in Reims. Eisenhower's news blackout was intended to appease an increasingly truculent and distrustful ally. Documents in the National Archives bear this out. But the Atlantic article did not put Kennedy's career back on track. This former star of international journalism spent the rest of his life in small-town-newspaper exile, brooding and embittered. He died in Monterey, California, in 1963 after stepping from a bar into the path of an oncoming sedan.
Now scroll ahead 50 years to the present day: Ed Kennedy has been nominated for a Pulitzer Prize, honoring him in death for the decision that undid him in life. Dozens of journalists have joined the cause, petitioning the Pulitzer board on Kennedy's behalf. Kennedy's rehabilitation began last year with the publication of memoirs that had sat for years in a box in his daughter's attic. After being invited to write the foreword to Ed Kennedy's War, Associated Press CEO Tom Curley was moved to issue a public apology for Kennedy's firing. Publicity from that apology inspired the Pulitzer drive, and the board is set to announce this year's winners on April 15.
The Pulitzer board has bestowed posthumous awards in the past, all in music, so Kennedy appears eligible by precedent. Does he deserve this recognition? To answer that, one must first address a threshold question: Was breaking the news embargo ethically justifiable? War Department officials and the journalists whom Kennedy "scooped" said emphatically that it was not, and some recent commentators agree. Their case falls apart under scrutiny:
Kennedy broke his word when he broke the embargo. Actually, it was Eisenhower's command that broke the embargo. As Ike's chief of staff, Gen. Walter Bedell Smith, acknowledged after the war, the Allies ordered German radio to broadcast news of the surrender repeatedly to ensure that German forces stood down. The Germans complied, and Kennedy filed his story only after learning of the German broadcasts. He told the senior censor that these broadcasts had nullified the embargo and that he was no longer pledged to honor it.
Kennedy failed to inform his bosses that his dispatch broke the embargo. This is true, but he faced a dilemma. Kennedy had dictated his story to the AP London bureau by phone. The London bureau then had to relay the story to New York headquarters for editing, over a trans-Atlantic cable monitored by a military censor. Kennedy thus had two options. He could dictate the story without warning editors that it was unauthorized, and get it into print. Or he could include a note telling editors that he was breaking the embargo, guaranteeing that the censor would stop the dispatch. Kennedy made a difficult choice, but not a deceitful one.
Kennedy betrayed his fellow correspondents by failing to inform them of his intentions in advance. Come on. Wire reporters are paid to be first. If he had stopped to confer with his nearly 60 rivals, not only would he have lost the scoop, but he might also have inadvertently alerted the authorities, making it impossible for anyone to file the story.
The most important point, though, is that more than a scoop was at stake with this story. Human lives were in the balance as well. According to histories of the conflict, about 60 Americans on average, along with countless others, were dying each day as the war in Europe wound down. So Kennedy's report that the war was over might well have saved some lives, while bringing relief to millions of families of service members. Kennedy's story also revealed the diplomatic subtext described above. To give Stalin time to set up his propaganda surrender ceremony, President Truman had risked an increased death toll by officially keeping the war going for another day or two. Stalin's "Berlin surrender" version took root in the Soviet Union, where Victory Day is celebrated on May 9. Thanks in part to Ed Kennedy, however, VE Day in the West commemorates the real surrender in Reims.
The Ed Kennedy controversy became a huge story in the United States following Germany's surrender. Editorial writers and members of the public came to his defense, incensed that their own government would bottle up the best news of the war. In the face of this bad publicity, Army Chief of Staff Gen. George Marshall ordered Eisenhower to go after Kennedy, according to documents in the National Archives. Ike's public relations chief held a press conference castigating the reporter for violating security. Meanwhile, government and other pressure led Associated Press President Robert McLean to apologize publicly for Kennedy's conduct before all the facts were in. Kennedy's AP career was over.
The Pulitzer board has awarded special citations recognizing a journalist's body of work, not merely an article or series. The case for a Kennedy Pulitzer is stronger if one takes into account his entire career as a war correspondent, starting with the Spanish Civil War in 1937 and continuing through desperate battles in North Africa and Crete, the beachhead at Anzio, Italy, and the Allied invasion of southern France. Through it all, he butted heads continually with Army censors and PR officers who sought to keep journalists under tight control. In September 1944, Kennedy took a jeep and broke away from headquarters, driving from southern France toward Paris, eluding retreating German units, mapping areas that had fallen under the control of the Resistance, and documenting a Nazi massacre of men, women, and children. He arrived in Paris only to have his credentials suspended for traveling without permission. Eric Sevareid, who covered the war for CBS, described Kennedy in his 1953 memoir as "one of the most rigidly honest, most unflaggingly objective journalists, who never ceased in his efforts to free the news . . . He did more to hold the military to the letter of the censorship rules . . . than any other journalist I know."
Dealing with Army PR was a Kafkaesque experience then, as it can be today. Eisenhower said in his farewell press conference for war correspondents in Europe that there had been no serious censorship of their copy. In that same press conference, he reminded them to clear with a PR officer any statement they wanted to quote. He also said he regarded journalists accredited to his command as "auxiliary staff officers" whose job was to support the war effort through "objective" reporting. In reality, of course, one can't be both a quasi-soldier and an independent reporter. Ed Kennedy chose to be the latter, and it very nearly destroyed him. Even 50 years after his death, awarding a prize to Kennedy might convey a useful message following the recent decade of war: We need more Ed Kennedys and fewer "auxiliary staff officers" in the press.
Christopher Hanson, a professor at the University of Maryland's Philip Merrill College of Journalism and a longtime contributor to the Columbia Journalism Review, competed with the Associated Press for eight years as a Reuters correspondent in Washington and London. He covered the Pentagon and was a combat correspondent in the Gulf War.