A copy of The New York Times published May 8, 1945, bearing Kennedy's scoop (AP/Rick Bowmer)
On May 7, 1945, Associated Press Paris bureau chief Ed Kennedy set off one of the biggest journalism controversies of the 20th century. Nazi Germany had surrendered unconditionally to the Allies early that morning in a schoolhouse in Reims. Unbelievable as it may seem today, Supreme Allied Commander Gen. Dwight Eisenhower imposed a news blackout on the surrender, under orders from President Truman. Big official secrets were easier to keep then, but not always. Kennedy had access to an unauthorized phone line. Gambling his career, he used that line to break the surrender story. His exclusive, eyewitness account of the ceremony got huge news play and led to mass rejoicing in Paris, London, New York, and elsewhere.
For the gaunt, intense Kennedy, it became the scoop from Hell. Allied headquarters stripped away his press credentials, denounced him personally for defying the rules, and banished him to New York, where the AP fired him. Meanwhile, 54 rival reporters who had abided by the news embargo signed a statement branding Kennedy a double-crosser. The label lingered. In 1960, Walter Cronkite of CBS, a former United Press war correspondent, refused to stand when Kennedy offered his hand, according to a journalist who witnessed the encounter.
Kennedy tried for years to repair his damaged reputation, publishing a lengthy self-defense in the Atlantic ("I'd Do It Again," August 1948). Among other points, Kennedy argued that Ike had not ordered the blackout for legitimate reasons of military security but for political reasons that did not justify censorship. Soviet dictator Josef Stalin wanted to stage a second surrender ceremony in Berlin, to sell the illusion that the Nazis had surrendered first to the Soviets, and he did not want his propaganda ceremony overshadowed by news of the authentic surrender in Reims. Eisenhower's news blackout was intended to appease an increasingly truculent and distrustful ally. Documents in the National Archives bear this out. But the Atlantic article did not put Kennedy's career back on track. This former star of international journalism spent the rest of his life in small-town-newspaper exile, brooding and embittered. He died in Monterey, California, in 1963 after stepping from a bar into the path of an oncoming sedan.
Now scroll ahead 50 years to the present day: Ed Kennedy has been nominated for a Pulitzer Prize, honoring him in death for the decision that undid him in life. Dozens of journalists have joined the cause, petitioning the Pulitzer board on Kennedy's behalf. Kennedy's rehabilitation began last year with publication of memoirs that had sat for years in a box in his daughter's attic. After being invited to write the foreword to Ed Kennedy's War, Associated Press CEO Tom Curley was moved to issue a public apology for Kennedy's firing. Publicity from that apology inspired the Pulitzer drive, and the board is set to announce this year's winners on April 15.
The Pulitzer board has bestowed posthumous awards in the past, all in music, so Kennedy appears eligible by precedent. Does he deserve this recognition? To answer that, one must first address a threshold question: Was breaking the news embargo ethically justifiable? War Department officials and the journalists whom Kennedy "scooped" said emphatically that it was not, and some recent commentators agree. Their case falls apart under scrutiny:
Kennedy broke his word when he broke the embargo. Actually, it was Eisenhower's command that broke the embargo. As Gen. Walter Bedell Smith, Ike's chief of staff, acknowledged after the war, the Allies ordered German radio to broadcast news of the surrender repeatedly to ensure that German forces stood down. The Germans complied, and Kennedy filed his story only after learning of the German broadcasts. He told the senior censor that these broadcasts had nullified the embargo and that he was no longer pledged to honor it.
Kennedy failed to inform his bosses that his dispatch broke the embargo. This is true, but he faced a dilemma. Kennedy had dictated his story to the AP London bureau by phone. The London bureau then had to relay the story to New York headquarters for editing, using a trans-Atlantic cable monitored by a military censor. Kennedy thus had two options. He could dictate the story without a warning to editors that it was unauthorized, and get it into print. Or he could include a note telling editors that he was breaking the embargo, ensuring that the censor would stop the dispatch. Kennedy made a difficult choice, but not a deceitful one.
Kennedy betrayed his fellow correspondents by failing to inform them of his intentions in advance. Come on. Wire reporters are paid to be first. If he had stopped to confer with his nearly 60 rivals, he not only would have lost the scoop but might inadvertently have alerted the authorities, making it impossible for anyone to file the story.
The most important point, though, is that more than a scoop was at stake with this story. Human lives were in the balance as well. According to histories of the conflict, about 60 Americans on average, along with countless others, were dying per day as the war in Europe wound down. So Kennedy's report that the war was over might well have saved some lives, while bringing relief to millions of families of service members. Kennedy's story also revealed the diplomatic subtext described above. To give Stalin time to set up his propaganda surrender ceremony, President Truman had risked an increased death toll by keeping the war on officially for another day or two. Stalin's "Berlin surrender" version took root in the Soviet Union, where Victory Day is celebrated on May 9. Thanks in part to Ed Kennedy, however, VE Day in the West commemorates the real surrender in Reims.
The Ed Kennedy controversy became a huge story in the United States following Germany's surrender. Editorial writers and members of the public came to his defense, incensed that their own government would bottle up the best news of the war. In the face of this bad publicity, Army Chief of Staff General George Marshall ordered Eisenhower to go after Kennedy, according to documents in the National Archives. Ike's public relations chief held a press conference castigating the reporter for violating security. Meanwhile, government and other pressure led Associated Press President Robert McLean to apologize publicly for Kennedy's conduct before all the facts were in. Kennedy's AP career was over.
The Pulitzer board has awarded special citations recognizing a journalist's body of work, not merely an article or series. The case for a Kennedy Pulitzer is stronger if one takes into account his entire career as a war correspondent, starting with the Spanish Civil War in 1937 and continuing through desperate battles in North Africa and Crete, the beachhead at Anzio, Italy, and the Allied invasion of southern France. Through it all, he butted heads continually with Army censors and PR officers who sought to keep journalists under tight control. In September 1944, Kennedy took a jeep and broke away from headquarters, driving from southern France toward Paris, eluding retreating German units, mapping areas that had fallen under control of the Resistance, and documenting a Nazi massacre of men, women, and children. He arrived in Paris only to have his credentials suspended for traveling without permission. Eric Sevareid, who covered the war for CBS, described Kennedy in his 1953 memoir as "one of the most rigidly honest, most unflaggingly objective journalists, who never ceased in his efforts to free the news . . . He did more to hold the military to the letter of the censorship rules . . . than any other journalist I know."
Dealing with Army PR was a Kafkaesque experience then, as it can be today. Eisenhower said in his farewell press conference for war correspondents in Europe that there had been no serious censorship of their copy. In that same press conference, he reminded them to clear any statement they wanted to quote with a PR officer. He also said he regarded journalists accredited to his command as "auxiliary staff officers" whose job was to support the war effort through "objective" reporting. In reality, of course, one can't be both a quasi-soldier and an independent reporter. Ed Kennedy chose to be the latter, and it very nearly destroyed him. Even 50 years after his death, awarding a prize to Kennedy might convey a useful message following the recent decade of war: We need more Ed Kennedys and fewer "auxiliary staff officers" in the press.
Christopher Hanson, a professor at the University of Maryland's Merrill College of Journalism and longtime contributor to Columbia Journalism Review, competed with the Associated Press for eight years as a Reuters correspondent in Washington and London. He covered the Pentagon and was a combat correspondent in the Gulf War.
Angela Merkel has served formal notice that she will lead the German wandering away from the American alliance.
Seven years after the end of the Second World War, on the 10th of March 1952, the governments of the United States, the United Kingdom, France, and the newly established Federal Republic of Germany received an astounding note from the Soviet Union.
The Soviet Union offered to withdraw the troops that then occupied eastern Germany and to end its rule over the occupied zone. Germany would be reunited under a constitution that allowed the country freedom to choose its own social system. Germany would even be allowed to rebuild its military, and all Germans except those convicted of war crimes would regain their political rights. In return, the Allied troops in western Germany would also be withdrawn—and reunited Germany would be forbidden to join the new NATO alliance.
As Republicans in Congress try to fend off the flurry of scandals, they are haunted by a question: Is this as good as it’s going to get?
The speaker of the House strode to his lectern on a recent Thursday to confront another totally normal day on Capitol Hill: health care, tax reform, a president under investigation, rumblings of impeachment.
“Morning, everybody!” Paul Ryan chirped. “Busy week!”
It was indeed: Less than a day had passed since the appointment of a special prosecutor to investigate Russia’s involvement in the presidential campaign; just a few hours since President Trump angrily tweeted that the investigation was “the single greatest witch hunt of a politician in American history!”; and only minutes since the Russia-linked former national-security adviser, Michael Flynn, had begun defying congressional subpoenas. A few days prior, the president had been accused of revealing sensitive intelligence information to the Russian foreign minister.
Should you drink more coffee? Should you take melatonin? Can you train yourself to need less sleep? A physician’s guide to sleep in a stressful age.
During residency, I worked hospital shifts that could last 36 hours, without sleep, often without breaks of more than a few minutes. Even writing this now, it sounds to me like I'm bragging or laying claim to some fortitude of character. I can't think of another type of self-injury that might be similarly lauded, except maybe binge drinking. Technically the shifts were 30 hours, the mandatory limit imposed by the Accreditation Council for Graduate Medical Education, but we stayed longer because people kept getting sick. Being a doctor is supposed to be about putting other people's needs before your own. Our job was to power through.
The shifts usually felt shorter than they were, because they were so hectic. There was always a new patient in the emergency room who needed to be admitted, or a staff member on the eighth floor (which was full of late-stage terminally ill people) who needed me to fill out a death certificate. Sleep deprivation manifested as bouts of anger and despair mixed in with some euphoria, along with other sensations I’ve not had before or since. I remember once sitting with the family of a patient in critical condition, discussing an advance directive—the terms defining what the patient would want done were his heart to stop, which seemed likely to happen at any minute. Would he want to have chest compressions, electrical shocks, a breathing tube? In the middle of this, I had to look straight down at the chart in my lap, because I was laughing. This was the least funny scenario possible. I was experiencing a physical reaction unrelated to anything I knew to be happening in my mind. There is a type of seizure, called a gelastic seizure, during which the seizing person appears to be laughing—but I don’t think that was it. I think it was plain old delirium. It was mortifying, though no one seemed to notice.
She lived with us for 56 years. She raised me and my siblings without pay. I was 11, a typical American kid, before I realized who she was.
The ashes filled a black plastic box about the size of a toaster. It weighed three and a half pounds. I put it in a canvas tote bag and packed it in my suitcase this past July for the transpacific flight to Manila. From there I would travel by car to a rural village. When I arrived, I would hand over all that was left of the woman who had spent 56 years as a slave in my family’s household.
What's the healthiest way to keep everyone caffeinated?
“I don't have one. They're kind of expensive to use,” John Sylvan told me frankly, of Keurig K-Cups, the single-serve brewing pods that have fundamentally changed the coffee experience in recent years. “Plus it’s not like drip coffee is tough to make.” Which would seem like a pretty banal sentiment, were Sylvan not the inventor of the K-Cup.
Almost one in three American homes now has a pod-based coffee machine, even though Sylvan never imagined they would be used outside of offices. Last year K-Cups accounted for most of Keurig Green Mountain’s $4.7 billion in revenue—more than five times what the company made five years prior. So even though he gets treated like a minor celebrity when he tells people he founded Keurig, Sylvan has some regrets about selling his share of the company in 1997 for $50,000. But that’s not what really upsets him.
The condition has long been considered untreatable. Experts can spot it in a child as young as 3 or 4. But a new clinical approach offers hope.
This is a good day, Samantha tells me: 10 on a scale of 10. We’re sitting in a conference room at the San Marcos Treatment Center, just south of Austin, Texas, a space that has witnessed countless difficult conversations between troubled children, their worried parents, and clinical therapists. But today promises unalloyed joy. Samantha’s mother is visiting from Idaho, as she does every six weeks, which means lunch off campus and an excursion to Target. The girl needs supplies: new jeans, yoga pants, nail polish.
At 11, Samantha is just over 5 feet tall and has wavy black hair and a steady gaze. She flashes a smile when I ask about her favorite subject (history), and grimaces when I ask about her least favorite (math). She seems poised and cheerful, a normal preteen. But when we steer into uncomfortable territory—the events that led her to this juvenile-treatment facility nearly 2,000 miles from her family—Samantha hesitates and looks down at her hands. “I wanted the whole world to myself,” she says. “So I made a whole entire book about how to hurt people.”
Colleges are adjusting to increasing contact with adults who are more ingrained in their children’s lives than ever.
Stacy G.’s daughter was having a meltdown. Her daughter, a sophomore at a prestigious private college, wanted an internship at Boston Children’s Hospital, a plum job that would look great on her applications to graduate school. After four weeks of frantically waiting for the school to arrange for an interview at the hospital, Stacy called her daughter’s adviser at the internships office to complain.
“For $65,000 [in full attendance costs], you can bet your sweet ass that I’m calling that school ... If your children aren’t getting what they’ve been promised, colleges are going to get that phone call from parents,” Stacy said. “It’s my money. It’s a lot of money. We did try to have her handle it on her own, but when it didn’t work out, I called them.”
The increasingly illiberal European country offers shelter to a growing number of international nationalists.
In February 2017, at the state of the nation address, Viktor Orbán, the prime minister of Hungary and the leader of the far-right, anti-immigrant Fidesz party, offered his vision for the country in the coming year. “We shall let in true refugees: Germans, Dutch, French, and Italians, terrified politicians and journalists who here in Hungary want to find the Europe they have lost in their homelands,” he proclaimed.
In reality, Orbán’s “refugees” have been moving to Hungary, and Budapest in particular, for years. A small clique of Identitarians, or aggrieved nationalists from Sweden, the United Kingdom, the United States, France, and elsewhere, all motivated by their disdain for their home countries’ commitment to liberal values, have found an ideological match in his Hungary, where two extreme far-right parties, the governing Fidesz and Jobbik, the largest opposition party, make up most of the National Assembly. Jobbik is the first European political party to champion a border wall. Its members frequently express open anti-Semitic and anti-Roma sentiments, and prioritize the preservation of “Hungary for the Hungarians.”
The permissiveness of Republican leaders who acquiesce to violence, collusion, and corruption is encouraging more of the same.
In the annals of the Trump era, May 25, 2017, will deserve a special mark. Four remarkable things happened on Thursday, each of which marks a way that this presidency is changing the nation.
The first remarkable thing was President Trump’s speech at the NATO summit in Brussels. Many European governments had hoped—which is a polite way to say that they had suggested and expected—that Trump would reaffirm the American commitment to defend NATO members if attacked. This is the point of the whole enterprise after all! Here’s how it was done by President Obama at the NATO summit after the Russian invasion of Crimea:
First and foremost, we have reaffirmed the central mission of the Alliance. Article 5 enshrines our solemn duty to each other—“an armed attack against one … shall be considered an attack against them all.” This is a binding, treaty obligation. It is non-negotiable. And here in Wales, we’ve left absolutely no doubt—we will defend every Ally.
A century and a half after the Civil War, Mayor Mitch Landrieu asked his city to reexamine its past—and to wrestle with hard truths.
Mayor Mitch Landrieu of New Orleans has revived the genre of Memorial Day orations. In his widely read and replayed speech of May 19, 2017, defending his leadership of the removal of four prominent public monuments (one to Reconstruction-era white-supremacist violence, and the other three to the Confederate leaders Robert E. Lee, Jefferson Davis, and P. G. T. Beauregard), Landrieu eloquently tried to pull the Confederacy, at least in New Orleans, down from its pedestals once and for all. He beautifully labeled his city "a bubbling cauldron of many cultures," expressing its ancient roots in many Native American peoples; in at least two European empires; in African, Irish, Italian, French, and many other ethnic lineages; and of course in cuisine, jazz, and "second lines." New Orleans, he said, is a city made by all the nations of the world, one great "gumbo" made from many. The speech was as deeply patriotic as it was deeply political: "e pluribus unum" carries a weight right now in Trump's America that makes most politicians shy away from such full-throated embraces of pluralism and brutally honest historical consciousness. Indeed, any historical consciousness, save for toxic forms of nostalgia, is out of style among Trump's supporters as well as his cowed, silent enablers in the Republican Party.