A copy of The New York Times published May 8, 1945, bearing Kennedy's scoop (AP/Rick Bowmer)
On May 7, 1945, Associated Press Paris bureau chief Ed Kennedy set off one of the biggest journalism controversies of the 20th century. Nazi Germany had surrendered unconditionally to the Allies early that morning in a schoolhouse in Reims. Unbelievable as it may seem today, Supreme Allied Commander Gen. Dwight Eisenhower imposed a news blackout on the surrender, under orders from President Truman. Big official secrets were easier to keep then--but not always. Kennedy had access to an unauthorized phone line. Gambling his career, he used that line to break the surrender story. His exclusive, eyewitness account of the ceremony got huge news play and led to mass rejoicing in Paris, London, New York, and elsewhere.
For the gaunt, intense Kennedy, it became the scoop from Hell. Allied headquarters stripped away his press credentials, denounced him personally for defying the rules, and banished him to New York, where the AP fired him. Meanwhile, 54 rival reporters who had abided by the news embargo signed a statement branding Kennedy a double-crosser. The label lingered. In 1960, Walter Cronkite of CBS, a former United Press war correspondent, refused to stand when Kennedy offered his hand, according to a journalist who witnessed the encounter.
Kennedy tried for years to repair his damaged reputation, publishing a lengthy self-defense in the Atlantic ("I'd Do It Again," August 1948). Among other points, Kennedy argued that Ike had not ordered the blackout for legitimate reasons of military security. He had done so for political reasons that did not justify censorship. Soviet dictator Josef Stalin wanted to stage a second surrender ceremony in Berlin - to sell the illusion that the Nazis had surrendered first to the Soviets. He did not want his propaganda ceremony overshadowed by news of the authentic surrender in Reims. Eisenhower's news blackout was intended to appease an increasingly truculent and distrustful ally. Documents in the National Archives bear this out. But the Atlantic article did not put Kennedy's career back on track. This former star of international journalism spent the rest of his life in small-town-newspaper exile, brooding and embittered. He died in Monterey, California, in 1963 after stepping from a bar into the path of an oncoming sedan.
Now scroll ahead 50 years to the present day: Ed Kennedy has been nominated for a Pulitzer Prize, honoring him in death for the decision that undid him in life. Dozens of journalists have joined the cause, petitioning the Pulitzer board on Kennedy's behalf. Kennedy's rehabilitation began last year with publication of memoirs that had sat for years in a box in his daughter's attic. After being invited to write the foreword to Ed Kennedy's War, Associated Press CEO Tom Curley was moved to issue a public apology for Kennedy's firing. Publicity from that apology inspired the Pulitzer drive, and the board is set to announce this year's winners on April 15.
The Pulitzer board has bestowed posthumous awards in the past, all in music, so Kennedy appears eligible by precedent. Does he deserve this recognition? To answer that, one must first address a threshold question: Was breaking the news embargo ethically justifiable? War Department officials, and the journalists whom Kennedy "scooped," said emphatically that it was not, and some recent commentators agree. Their case falls apart under scrutiny:
Kennedy broke his word when he broke the embargo. Actually, it was Eisenhower's command that broke the embargo. As Ike's chief of staff, Gen. Walter Bedell Smith, acknowledged after the war, the Allies ordered German radio to broadcast news of the surrender repeatedly to ensure that German forces stood down. The Germans complied, and Kennedy filed his story only after learning of the German broadcasts. He told the senior censor that these broadcasts had nullified the embargo, and he was no longer pledged to honor it.
Kennedy failed to inform his bosses that his dispatch broke the embargo. This is true, but he faced a dilemma. Kennedy had dictated his story to the AP London bureau by phone. The London bureau then had to relay the story to New York headquarters for editing, using a trans-Atlantic cable monitored by a military censor. Kennedy thus had two options. He could dictate the story without a warning to editors that it was unauthorized, and get it into print. Or he could include a note telling editors that he was breaking the embargo, ensuring that the censor could stop the dispatch. Kennedy made a difficult choice, but not a deceitful one.
Kennedy betrayed his fellow correspondents by failing to inform them of his intentions in advance. Come on. Wire reporters are paid to be first. If he had stopped to confer with his nearly 60 rivals, he not only would have lost the scoop but might inadvertently have alerted the authorities, making it impossible for anyone to file the story.
The most important point, though, is that more than a scoop was at stake with this story. Human lives were in the balance as well. According to histories of the conflict, about 60 Americans, along with countless others, were dying each day as the war in Europe wound down. So Kennedy's report that the war was over might well have saved some lives, while bringing relief to millions of families of service members. Kennedy's story also revealed the diplomatic subtext described above. To give Stalin time to set up his propaganda surrender ceremony, President Truman had risked an increased death toll by keeping the war on officially for another day or two. Stalin's "Berlin surrender" version took root in the Soviet Union, where Victory Day is celebrated on May 9. Thanks in part to Ed Kennedy, however, VE Day in the West commemorates the real surrender in Reims.
The Ed Kennedy controversy became a huge story in the United States following Germany's surrender. Editorial writers and members of the public came to his defense, incensed that their own government would bottle up the best news of the war. In the face of this bad publicity, Army Chief of Staff General George Marshall ordered Eisenhower to go after Kennedy, according to documents in the National Archives. Ike's public relations chief held a press conference castigating the reporter for violating security. Meanwhile, government and other pressure led Associated Press President Robert McLean to apologize publicly for Kennedy's conduct before all the facts were in. Kennedy's AP career was over.
The Pulitzer board has awarded special citations recognizing a journalist's body of work, not merely an article or series. The case for a Kennedy Pulitzer is stronger if one takes into account his entire career as a war correspondent, starting with the Spanish Civil War in 1937 and continuing through desperate battles in North Africa and Crete, the beachhead at Anzio, Italy, and the Allied invasion of Southern France. Through it all, he butted heads continually with Army censors and PR officers who sought to keep journalists under tight control. In September 1944, Kennedy took a jeep and broke away from headquarters, driving from southern France toward Paris, eluding retreating German units, mapping areas that had fallen under control of the Resistance, and documenting a Nazi massacre of men, women, and children. He arrived in Paris only to have his credentials suspended for traveling without permission. Eric Sevareid, who covered the war for CBS, described Kennedy in his 1953 memoir as "one of the most rigidly honest, most unflaggingly objective journalists, who never ceased in his efforts to free the news . . . He did more to hold the military to the letter of the censorship rules . . . than any other journalist I know."
Dealing with Army PR was a Kafkaesque experience then, as it can be today. Eisenhower said in his farewell press conference for war correspondents in Europe that there had been no serious censorship of their copy. In that same press conference, he reminded them to clear any statement they wanted to quote with a PR officer. He also said he regarded journalists accredited to his command as "auxiliary staff officers" whose job was to support the war effort through "objective" reporting. In reality, of course, one can't be both a quasi-soldier and an independent reporter. Ed Kennedy chose to be the latter, and it very nearly destroyed him. Even 50 years after his death, awarding a prize to Kennedy might convey a useful message following the recent decade of war: We need more Ed Kennedys and fewer "auxiliary staff officers" in the press.
Christopher Hanson, a professor at the University of Maryland’s Merrill College of Journalism and long-time contributor to Columbia Journalism Review, competed with the Associated Press for eight years as a Reuters correspondent in Washington and London. He covered the Pentagon and was a combat correspondent in the Gulf War.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Places like St. Louis and New York City were once similarly prosperous. Then, 30 years ago, the United States turned its back on the policies that had been encouraging parity.
Despite all the attention focused these days on the fortunes of the “1 percent,” debates over inequality still tend to ignore one of its most politically destabilizing and economically destructive forms. This is the growing, and historically unprecedented, economic divide that has emerged in recent decades among the different regions of the United States.
Until the early 1980s, a long-running feature of American history was the gradual convergence of income across regions. The trend goes back to at least the 1840s, but grew particularly strong during the middle decades of the 20th century. This was, in part, a result of the South catching up with the North in its economic development. As late as 1940, per-capita income in Mississippi, for example, was still less than one-quarter that of Connecticut. Over the next 40 years, Mississippians saw their incomes rise much faster than did residents of Connecticut, until by 1980 Mississippi's per-capita income had climbed to 58 percent of Connecticut's.
The sport is becoming an enterprise where underprivileged young men risk their health for the financial benefit of the wealthy.
Football can be a force for good. The University of Missouri’s football team proved it earlier this month when student athletes took a facet of campus life that’s often decried—the cultural and economic dominance of college football—and turned it into a powerful leverage point in the pursuit of social justice. Football can build a sense of community for players and fans alike, and serve as a welcome escape from the pressures of ordinary life. The sport cuts across distinctions of race, class, geography, and religion in a way few other U.S. institutions do, and everyone who participates reaps the benefits.
But not everyone—particularly at the amateur level—takes on an equal share of the risk. College football in particular seems headed toward a future in which it’s consumed by people born into privilege while the sport consumes people born without it. In a 2010 piece in The Awl, Cord Jefferson wrote, “Where some see the Super Bowl, I see young black men risking their bodies, minds, and futures for the joy and wealth of old white men.” This vision sounds dystopian but is quickly becoming an undeniable reality, given new statistics about how education affects awareness about brain-injury risk, as well as the racial makeup of Division I rosters and coaching staffs. The future of college football indeed looks a lot like what Jefferson called “glorified servitude,” and even as information comes to light about the dangers and injustices of football, nothing is currently being done to steer the sport away from that path.
Without the financial support that many white families can provide, minority young people have to continually make sacrifices that set them back.
He died on a Saturday.
My mother and I had planned to pick my dad up from the hospital for a trip to the park. He loved to sit and watch families stroll by as we chatted about oak trees, Kona coffee, and the mysteries of God. This time, the park would miss him.
His skin, smooth and brown like the outside of an avocado seed, glistened with sweat as he struggled to take his last breaths.
In that next year, I graduated from grad school, got a new job, and looked forward to saving for a down payment on my first home, a dream I had always had, but found lofty. I pulled up a blank spreadsheet and made a line item called “House Fund.”
As the public’s fear and loathing surge, the frontrunner’s durable candidacy has taken a dark turn.
MYRTLE BEACH, South Carolina—All politicians, if they are any good at their craft, know the truth about human nature.
Donald Trump is very good, and he knows it better than most.
Trump stands alone on a long platform, surrounded by a rapturous throng. Below and behind him—sitting on bleachers and standing on the floor—they fill this city’s cavernous, yellow-beige convention center by the thousands. As Trump will shortly point out, there are a lot of other Republican presidential candidates, but none of them get crowds anything like this.
Trump raises an orange-pink hand like a waiter holding a tray. “They are not coming in from Syria,” he says. “We’re sending them back!” The crowd surges, whistles, cheers. “So many bad things are happening—they have sections of Paris where the police are afraid to go,” he continues. “Look at Belgium, the whole place is closed down! We can’t let it happen here, folks.”
Live in anticipation, gathering stories and memories. New research builds on an in-vogue mantra of behavioral economics.
Forty-seven percent of the time, the average mind is wandering. It wanders about a third of the time while a person is reading, talking with other people, or taking care of children. It wanders 10 percent of the time, even, during sex. And that wandering, according to psychologist Matthew Killingsworth, is not good for well-being. A mind belongs in one place. During his training at Harvard, Killingsworth compiled those numbers and built a scientific case for every cliché about living in the moment. In a 2010 Science paper, he and psychology professor Daniel Gilbert wrote that "a wandering mind is an unhappy mind."
For Killingsworth, happiness is in the content of moment-to-moment experiences. Nothing material is intrinsically valuable, except in whatever promise of happiness it carries. Satisfaction in owning a thing does not have to come during the moment it's acquired, of course. It can come as anticipation or nostalgic longing. Overall, though, the human brain's ability to contemplate events past and future at great, tedious length has, these psychologists believe, come at the expense of happiness. Minds tend to wander to dark, not whimsical, places. Unless that mind has something exciting to anticipate or sweet to remember.
Why are so many kids with bright prospects killing themselves in Palo Alto?
The air shrieks, and life stops. First, from far away, comes a high whine like angry insects swarming, and then a trampling, like a herd moving through. The kids on their bikes who pass by the Caltrain crossing are eager to get home from school, but they know the drill. Brake. Wait for the train to pass. Five cars, double-decker, tearing past at 50 miles an hour. Too fast to see the faces of the Silicon Valley commuters on board, only a long silver thing with black teeth. A Caltrain coming into a station slows, invites you in. But a Caltrain at a crossing registers more like an ambulance, warning you fiercely out of its way.
The kids wait until the passing train forces a gust you can feel on your skin. The alarms ring and the red lights flash for a few seconds more, just in case. Then the gate lifts up, signaling that it’s safe to cross. All at once life revives: a rush of bikes, skateboards, helmets, backpacks, basketball shorts, boisterous conversation. “Ew, how old is that gum?” “The quiz is next week, dipshit.” On the road, a minivan makes a left a little too fast—nothing ominous, just a mom late for pickup. The air is again still, like it usually is in spring in Palo Alto. A woodpecker does its work nearby. A bee goes in search of jasmine, stinging no one.
A Chicago cop now faces murder charges—but will anyone hold his colleagues, his superiors, and elected officials accountable for their failures?
Thanks to clear video evidence, Chicago police officer Jason Van Dyke was charged this week with first-degree murder for shooting 17-year-old Laquan McDonald. Nevertheless, thousands of people took to the city’s streets on Friday in protest. And that is as it should be.
The needlessness of the killing is clear and unambiguous in the dash-cam video of the shooting.
Yet that dash-cam footage was suppressed for more than a year by authorities citing an investigation. “There was no mystery, no dead-end leads to pursue, no ambiguity about who fired the shots,” Eric Zorn wrote in The Chicago Tribune. “Who was pursuing justice and the truth? What were they doing? Who were they talking to? With whom were they meeting? What were they trying to figure out for 400 days?”
Adults remember more of what they learned in school than they think they do—thanks to an aspect of education that doesn’t get much attention in policy debates.
I recently found a box of papers from high school and was shocked to see what I once knew. There, in my handwriting, were a multi-step geometric proof, a creditable essay on the United States’ involvement in the Philippine revolution, and other work that today is as incomprehensible to me as a Swedish newscast.
Chances are this is a common experience among adults like me who haven’t set foot in a classroom for ages—which might suggest there wasn’t much point in learning the stuff in the first place. But then again, maybe there is.
Research shows that people can often retain certain information long after learning it in school. For example, in one 1998 study, 1,168 adults took an exam in developmental psychology similar to the final exam they had taken for a college course between three and 16 years earlier. Yes, much had been forgotten, especially within the first three years after taking the course—but not everything. The study found that even after 16 years, participants had retained some knowledge from the college course, particularly facts (versus the application of mental skills). Another study, this one published in 1991, examined memory for high-school math content and found similar results.
You can't fake being calm and carrying on. Unless you can.
If you were still warm from the embrace of several beautiful young people when your copy of New York magazine arrived this week, you probably got a pretty good shock to hear that we Millennials are not, in fact, the hookup generation. I certainly was (wink emoticon). No, says writer Maureen O'Connor, we are "the breakup generation."
Even more specifically, we're the generation of public breakups and public post-breakup damage control. Facebook released data earlier this year showing that when people switch their status from indicating any type of relationship to "single," they see an immediate, transient 225 percent increase in the volume of interactions on the site. In those days and weeks (months?) after a relationship ends, it's also true that the theatrics of our social-media caricatures bend toward an audience of one. "I am fine," says the Instagram, in fewer words but so many more. "I am doing fine. Can't you tell?"