A copy of The New York Times published May 8, 1945, bearing Kennedy's scoop (AP/Rick Bowmer)
On May 7, 1945, Associated Press Paris bureau chief Ed Kennedy set off one of the biggest journalism controversies of the 20th century. Nazi Germany had surrendered unconditionally to the Allies early that morning in a schoolhouse in Reims. Unbelievable as it may seem today, Supreme Allied Commander Gen. Dwight Eisenhower imposed a news blackout on the surrender, under orders from President Truman. Big official secrets were easier to keep then, but not always. Kennedy had access to an unauthorized phone line. Gambling his career, he used that line to break the surrender story. His exclusive, eyewitness account of the ceremony got huge news play and led to mass rejoicing in Paris, London, New York, and elsewhere.
For the gaunt, intense Kennedy, it became the scoop from Hell. Allied headquarters stripped away his press credentials, denounced him personally for defying the rules, and banished him to New York, where the AP fired him. Meanwhile, 54 rival reporters who had abided by the news embargo signed a statement branding Kennedy a double-crosser. The label lingered. In 1960, Walter Cronkite of CBS, a former United Press war correspondent, refused to stand when Kennedy offered his hand, according to a journalist who witnessed the encounter.
Kennedy tried for years to repair his damaged reputation, publishing a lengthy self-defense in the Atlantic ("I'd Do It Again," August 1948). Among other points, Kennedy argued that Ike had not ordered the blackout for legitimate reasons of military security. He had done so for political reasons that did not justify censorship. Soviet dictator Josef Stalin wanted to stage a second surrender ceremony in Berlin, to sell the illusion that the Nazis had surrendered first to the Soviets. He did not want his propaganda ceremony overshadowed by news of the authentic surrender in Reims. Eisenhower's news blackout was intended to appease an increasingly truculent and distrustful ally. Documents in the National Archives bear this out. But the Atlantic article did not put Kennedy's career back on track. This former star of international journalism spent the rest of his life in small-town-newspaper exile, brooding and embittered. He died in Monterey, California, in 1963 after stepping from a bar into the path of an oncoming sedan.
Now scroll ahead 50 years to the present day: Ed Kennedy has been nominated for a Pulitzer Prize, honoring him in death for the decision that undid him in life. Dozens of journalists have joined the cause, petitioning the Pulitzer board on Kennedy's behalf. Kennedy's rehabilitation began last year with publication of memoirs that had sat for years in a box in his daughter's attic. After being invited to write the foreword to Ed Kennedy's War, Associated Press CEO Tom Curley was moved to issue a public apology for Kennedy's firing. Publicity from that apology inspired the Pulitzer drive, and the board is set to announce this year's winners on April 15.
The Pulitzer board has bestowed posthumous awards in the past, all in music, so Kennedy appears eligible by precedent. Does he deserve this recognition? To answer that, one must first address a threshold question: Was breaking the news embargo ethically justifiable? War Department officials and the journalists whom Kennedy "scooped" said emphatically that it was not, and some recent commentators agree. Their case falls apart under scrutiny:
Kennedy broke his word when he broke the embargo. Actually, it was Eisenhower's command that broke the embargo. As Gen. Walter Bedell Smith, Ike's chief of staff, acknowledged after the war, the Allies ordered German radio to broadcast news of the surrender repeatedly to ensure that German forces stood down. The Germans complied, and Kennedy filed his story only after learning of the German broadcasts. He told the senior censor that these broadcasts had nullified the embargo, and he was no longer pledged to honor it.
Kennedy failed to inform his bosses that his dispatch broke the embargo. This is true, but he faced a dilemma. Kennedy had dictated his story to the AP London bureau by phone. The London bureau then had to relay the story to New York headquarters for editing, using a trans-Atlantic cable monitored by a military censor. Kennedy thus had two options. He could dictate the story without a warning to editors that it was unauthorized, and get it into print. Or he could include a note telling editors that he was breaking the embargo, ensuring that the censor would stop the dispatch. Kennedy made a difficult choice, but not a deceitful one.
Kennedy betrayed his fellow correspondents by failing to inform them of his intentions in advance. Come on. Wire reporters are paid to be first. Had he stopped to confer with his nearly 60 rivals, he not only would have lost the scoop but might also have inadvertently alerted the authorities, making it impossible for anyone to file the story.
The most important point, though, is that more than a scoop was at stake with this story. Human lives were in the balance as well. On average, about 60 Americans were dying per day as the war in Europe wound down, along with countless others, according to histories of the conflict. So Kennedy's report that the war was over might well have saved some lives, while bringing relief to millions of families of service members. Kennedy's story also revealed the diplomatic subtext described above. To give Stalin time to set up his propaganda surrender ceremony, President Truman had risked an increased death toll by keeping the war on officially for another day or two. Stalin's "Berlin surrender" version took root in the Soviet Union, where Victory Day is celebrated on May 9. Thanks in part to Ed Kennedy, however, VE Day in the West commemorates the real surrender in Reims.
The Ed Kennedy controversy became a huge story in the United States following Germany's surrender. Editorial writers and members of the public came to his defense, incensed that their own government would bottle up the best news of the war. In the face of this bad publicity, Army Chief of Staff General George Marshall ordered Eisenhower to go after Kennedy, according to documents in the National Archives. Ike's public relations chief held a press conference castigating the reporter for violating security. Meanwhile, government and other pressure led Associated Press President Robert McLean to apologize publicly for Kennedy's conduct before all the facts were in. Kennedy's AP career was over.
The Pulitzer board has awarded special citations recognizing a journalist's body of work, not merely an article or series. The case for a Kennedy Pulitzer is stronger if one takes into account his entire career as a war correspondent, starting with the Spanish Civil War in 1937 and continuing through desperate battles in North Africa and Crete, the beachhead at Anzio, Italy, and the Allied invasion of southern France. Through it all, he butted heads continually with Army censors and PR officers who sought to keep journalists under tight control. In September 1944, Kennedy took a jeep and broke away from headquarters, driving from southern France toward Paris, eluding retreating German units, mapping areas that had fallen under control of the Resistance, and documenting a Nazi massacre of men, women, and children. He arrived in Paris only to have his credentials suspended for traveling without permission. Eric Sevareid, who covered the war for CBS, described Kennedy in his 1953 memoir as "one of the most rigidly honest, most unflaggingly objective journalists, who never ceased in his efforts to free the news . . . He did more to hold the military to the letter of the censorship rules . . . than any other journalist I know."
Dealing with Army PR was a Kafkaesque experience then, as it can be today. Eisenhower said in his farewell press conference for war correspondents in Europe that there had been no serious censorship of their copy. In that same press conference, he reminded them to clear any statement they wanted to quote with a PR officer. He also said he regarded journalists accredited to his command as "auxiliary staff officers" whose job was to support the war effort through "objective" reporting. In reality, of course, one can't be both a quasi-soldier and an independent reporter. Ed Kennedy chose to be the latter, and it very nearly destroyed him. Even 50 years after his death, awarding a prize to Kennedy might convey a useful message following the recent decade of war: We need more Ed Kennedys and fewer "auxiliary staff officers" in the press.
Christopher Hanson, a professor at the University of Maryland’s Merrill College of Journalism and longtime contributor to Columbia Journalism Review, competed with the Associated Press for eight years as a Reuters correspondent in Washington and London. He covered the Pentagon and was a combat correspondent in the Gulf War.
Also notable about this brazen show of might is that the missiles traveled through two countries, Iran and Iraq, before hitting their 11 targets in Syria. This means that both countries either gave their permission or simply didn’t confront Putin about the use of their airspace on his birthday.
A new report details a black market in nuclear materials.
On Wednesday, the Associated Press published a horrifying report about criminal networks in the former Soviet Union trying to sell “radioactive material to Middle Eastern extremists.” At the center of these cases (the AP learned of four in the past five years) was a “thriving black market in nuclear materials” in a “tiny and impoverished Eastern European country”: Moldova.
It’s a new iteration of an old problem with a familiar geography. The breakup of the Soviet Union left a superpower’s worth of nuclear weapons scattered across several countries without a superpower’s capacity to keep track of them. When Harvard’s Graham Allison flagged this problem in 1996, he wrote that the collapse of Russia’s “command-and-control society” left nothing secure.
“If the office is going to become a collection of employees not working together, it essentially becomes no different than a coffee shop.”
There’s plenty of research out there on the benefits of remote and flexible work. It’s been shown to lead to increased productivity, and has an undeniable benefit for work-life balance. But what does it do to everyone back at the office?
In a 2013 memo to workers explaining why the company was eliminating policies that allowed remote work, Jackie Reses, Yahoo’s head of human resources, argued that some of the “best decisions and insights come from hallway and cafeteria discussion,” and that actual presence in the office encourages better collaboration and communication.
It leaves people bed-bound and drives some to suicide, but there's little research money devoted to the disease. Now, change is coming, thanks to the patients themselves.
This past July, Brian Vastag, a former science reporter, placed an op-ed with his former employer, the Washington Post. It was an open letter to the National Institutes of Health director Francis Collins, a man Vastag had once used as a source on his beat.
“I’ve been felled by the most forlorn of orphan illnesses,” Vastag wrote. “At 43, my productive life may well be over.”
There was no cure for his disease, known by some as chronic fatigue syndrome, Vastag wrote, and little NIH funding available to search for one. Would Collins step up and change that?
“As the leader of our nation’s medical research enterprise, you have a decision to make,” he wrote. “Do you want the NIH to be part of these solutions, or will the nation’s medical research agency continue to be part of the problem?”
Why Americans increasingly want inexperienced presidential candidates
The presidency, it’s often said, is a job for which everyone arrives unprepared. But just how unprepared is unprepared enough?
Political handicappers weigh presidential candidates’ partisanship, ideology, money, endorsements, consultants, and, of course, experience. Yet they too rarely consider an element of growing importance to voters: freshness. Increasingly, American voters view being qualified for the presidency as a disqualification.
In 2003, I announced in National Journal the 14-Year Rule. The rule was actually discovered by a presidential speechwriter named John McConnell, but because his job required him to keep his name out of print, I graciously stepped up to take credit. It is well known that to be elected president, you pretty much have to have been a governor or a U.S. senator. What McConnell had figured out was this: No one gets elected president who needs longer than 14 years to get from his or her first gubernatorial or Senate victory to either the presidency or the vice presidency.* Surprised, I scoured the history books and found that the rule works astonishingly well going back to the early 20th century, when the modern era of presidential electioneering began.
Forget the Common Core: Finland’s youngsters are in charge of determining what happens in the classroom.
“The changes to kindergarten make me sick,” a veteran teacher in Arkansas recently admitted to me. “Think about what you did in first grade—that’s what my 5-year-old babies are expected to do.”
The difference between first grade and kindergarten may not seem like much, but what I remember about my first-grade experience in the mid-90s doesn’t match the kindergarten she described in her email: three and a half hours of daily literacy instruction, an hour and a half of daily math instruction, 20 minutes of daily “physical activity time” (officially banned from being called “recess”), and two 56-question standardized tests in literacy and math—in the fourth week of school.
That American friend—who teaches 20 students without an aide—has fought to integrate 30 minutes of “station time” into the literacy block, which includes “blocks, science, magnetic letters, play dough with letter stamps to practice words, books, and storytelling.” But the most controversial area of her classroom is neither the blocks nor the stamps. Rather, it’s the “house station with dolls and toy food”—items her district tried to remove last year. The implication was clear: There’s no time for play in kindergarten anymore.
Somewhere in Europe, a man who goes by the name “Mikro” spends his days and nights targeting Islamic State supporters on Twitter.
In August 2014, a Twitter account affiliated with Anonymous, the hacker-crusader collective, declared “full-scale cyber war” against ISIS: “Welcome to Operation Ice #ISIS, where #Anonymous will do it’s [sic] part in combating #ISIS’s influence in social media and shut them down.”
In July, I traveled to a gloomy European capital city to meet one of the “cyber warriors” behind this operation. Online, he goes by the pseudonym Mikro. He is vigilant, bordering on paranoid, about hiding his actual identity, on account of all the death threats he has received. But a few months after I initiated a relationship with him on Twitter, Mikro allowed me to visit him in the apartment he shares with his girlfriend and two Rottweilers. He works alone from his chaotic living room, using an old, battered computer—not the state-of-the-art setup I had envisaged. On an average day, he told me, he spends up to 16 hours fixed to his sofa. He starts around noon, just after he wakes up, and works late into the night and early morning.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
National Geographic Magazine has opened its annual photo contest, with the deadline for submissions coming up on November 16, 2015. The Grand Prize Winner will receive $10,000 and a trip to National Geographic headquarters to participate in its annual photography seminar. The folks at National Geographic were once again kind enough to let me choose among the contest entries so far for display here. Captions written by the individual photographers.
American politicians are now eager to disown a failed criminal-justice system that’s left the U.S. with the largest incarcerated population in the world. But they've failed to reckon with history. Fifty years after Daniel Patrick Moynihan’s report “The Negro Family” tragically helped create this system, it's time to reclaim his original intent.
By his own lights, Daniel Patrick Moynihan, ambassador, senator, sociologist, and itinerant American intellectual, was the product of a broken home and a pathological family. He was born in 1927 in Tulsa, Oklahoma, but raised mostly in New York City. When Moynihan was 10 years old, his father, John, left the family, plunging it into poverty. Moynihan’s mother, Margaret, remarried, had another child, divorced, moved to Indiana to stay with relatives, then returned to New York, where she worked as a nurse. Moynihan’s childhood—a tangle of poverty, remarriage, relocation, and single motherhood—contrasted starkly with the idyllic American family life he would later extol.