A copy of The New York Times published May 8, 1945, bearing Kennedy's scoop (AP/Rick Bowmer)
On May 7, 1945, Associated Press Paris bureau chief Ed Kennedy set off one of the biggest journalism controversies of the 20th century. Nazi Germany had surrendered unconditionally to the Allies early that morning in a schoolhouse in Reims. Unbelievable as it may seem today, Supreme Allied Commander Gen. Dwight Eisenhower imposed a news blackout on the surrender, under orders from President Truman. Big official secrets were easier to keep in those days, but not always. Kennedy had access to an unauthorized phone line. Gambling his career, he used that line to break the surrender story. His exclusive, eyewitness account of the ceremony got huge news play and led to mass rejoicing in Paris, London, New York, and elsewhere.
For the gaunt, intense Kennedy, it became the scoop from Hell. Allied headquarters stripped away his press credentials, denounced him personally for defying the rules, and banished him to New York, where the AP fired him. Meanwhile, 54 rival reporters who had abided by the news embargo signed a statement branding Kennedy a double-crosser. The label lingered. In 1960, Walter Cronkite of CBS, a former United Press war correspondent, refused to stand when Kennedy offered his hand, according to a journalist who witnessed the encounter.
Kennedy tried for years to repair his damaged reputation, publishing a lengthy self-defense in the Atlantic ("I'd Do It Again," August 1948). Among other points, Kennedy argued that Ike had not ordered the blackout for legitimate reasons of military security; he had done so for political reasons that did not justify censorship. Soviet dictator Josef Stalin wanted to stage a second surrender ceremony in Berlin to sell the illusion that the Nazis had surrendered first to the Soviets. He did not want his propaganda ceremony overshadowed by news of the authentic surrender in Reims. Eisenhower's news blackout was intended to appease an increasingly truculent and distrustful ally. Documents in the National Archives bear this out. But the Atlantic article did not put Kennedy's career back on track. The former star of international journalism spent the rest of his life in small-town-newspaper exile, brooding and embittered. He died in Monterey, California, in 1963, after stepping from a bar into the path of an oncoming sedan.
Now scroll ahead 50 years to the present day: Ed Kennedy has been nominated for a Pulitzer Prize, honoring him in death for the decision that undid him in life. Dozens of journalists have joined the cause, petitioning the Pulitzer board on Kennedy's behalf. Kennedy's rehabilitation began last year with the publication of memoirs that had sat for years in a box in his daughter's attic. After being invited to write the foreword to Ed Kennedy's War, Associated Press CEO Tom Curley was moved to issue a public apology for Kennedy's firing. Publicity from that apology inspired the Pulitzer drive, and the board is set to announce this year's winners on April 15.
The Pulitzer board has bestowed posthumous awards in the past, all in music, so Kennedy appears eligible by precedent. Does he deserve this recognition? To answer that, one must first address a threshold question: Was breaking the news embargo ethically justifiable? War Department officials and the journalists whom Kennedy "scooped" said emphatically that it was not, and some recent commentators agree. Their case falls apart under scrutiny:
Kennedy broke his word when he broke the embargo. Actually, it was Eisenhower's command that broke the embargo. As Gen. Walter Bedell Smith, Ike's chief of staff, acknowledged after the war, the Allies ordered German radio to broadcast news of the surrender repeatedly to ensure that German forces stood down. The Germans complied, and Kennedy filed his story only after learning of the German broadcasts. He told the senior censor that these broadcasts had nullified the embargo and that he was no longer pledged to honor it.
Kennedy failed to inform his bosses that his dispatch broke the embargo. This is true, but he faced a dilemma. Kennedy had dictated his story to the AP London bureau by phone. The London bureau then had to relay the story to New York headquarters for editing, over a trans-Atlantic cable monitored by a military censor. Kennedy thus had two options. He could dictate the story without warning editors that it was unauthorized, and get it into print. Or he could include a note telling editors that he was breaking the embargo, ensuring that the censor would stop the dispatch. Kennedy made a difficult choice, but not a deceitful one.
Kennedy betrayed his fellow correspondents by failing to inform them of his intentions in advance. Come on. Wire reporters are paid to be first. Had he stopped to confer with his nearly 60 rivals, he not only would have lost the scoop but might also have inadvertently alerted the authorities, making it impossible for anyone to file the story.
The most important point, though, is that more than a scoop was at stake with this story. Human lives were in the balance as well. According to histories of the conflict, about 60 Americans, along with countless others, were dying each day as the war in Europe wound down. So Kennedy's report that the war was over might well have saved some lives, while bringing relief to millions of families of service members. Kennedy's story also revealed the diplomatic subtext described above. To give Stalin time to set up his propaganda surrender ceremony, President Truman had risked an increased death toll by keeping the war on officially for another day or two. Stalin's "Berlin surrender" version took root in the Soviet Union, where Victory Day is celebrated on May 9. Thanks in part to Ed Kennedy, however, VE Day in the West commemorates the real surrender in Reims.
The Ed Kennedy controversy became a huge story in the United States following Germany's surrender. Editorial writers and members of the public came to his defense, incensed that their own government would bottle up the best news of the war. In the face of this bad publicity, Army Chief of Staff General George Marshall ordered Eisenhower to go after Kennedy, according to documents in the National Archives. Ike's public relations chief held a press conference castigating the reporter for violating security. Meanwhile, government and other pressure led Associated Press President Robert McLean to apologize publicly for Kennedy's conduct before all the facts were in. Kennedy's AP career was over.
The Pulitzer board has awarded special citations recognizing a journalist's body of work, not merely an article or series. The case for a Kennedy Pulitzer is stronger if one takes into account his entire career as a war correspondent, starting with the Spanish Civil War in 1937 and continuing through desperate battles in North Africa and Crete, the beachhead at Anzio, Italy, and the Allied invasion of southern France. Through it all, he butted heads continually with Army censors and PR officers who sought to keep journalists under tight control. In September 1944, Kennedy took a jeep and broke away from headquarters, driving from southern France toward Paris, eluding retreating German units, mapping areas that had fallen under the control of the Resistance, and documenting a Nazi massacre of men, women, and children. He arrived in Paris only to have his credentials suspended for traveling without permission. Eric Sevareid, who covered the war for CBS, described Kennedy in his 1953 memoir as "one of the most rigidly honest, most unflaggingly objective journalists, who never ceased in his efforts to free the news . . . He did more to hold the military to the letter of the censorship rules . . . than any other journalist I know."
Dealing with Army PR was a Kafkaesque experience then, as it can be today. In his farewell press conference for war correspondents in Europe, Eisenhower said there had been no serious censorship of their copy, then reminded them in the same breath to clear any statement they wanted to quote with a PR officer. He also said he regarded journalists accredited to his command as "auxiliary staff officers" whose job was to support the war effort through "objective" reporting. In reality, of course, one can't be both a quasi-soldier and an independent reporter. Ed Kennedy chose to be the latter, and it very nearly destroyed him. Even 50 years after his death, awarding a prize to Kennedy might convey a useful message following the recent decade of war: We need more Ed Kennedys and fewer "auxiliary staff officers" in the press.
Christopher Hanson, a professor at the University of Maryland's Merrill College of Journalism and longtime contributor to Columbia Journalism Review, competed with the Associated Press for eight years as a Reuters correspondent in Washington and London. He covered the Pentagon and was a combat correspondent in the Gulf War.