Many times, at parties and in other conversations over the years, I
have vociferously defended fellow journalists against charges of bias
in their work, particularly journalists working in the lowly field of
print journalism, as opposed to TV.
That's not to say everyone in the field is perfect, unbiased, or even a good reporter.
And not that I haven't ever encountered an editor who really, really
wanted a story to say "X" as opposed to "Y." I remember one editor who
complained that a story I'd done about NASA test pilots didn't make
them sound like the wild cowboys he imagined they were.
(Unfortunately--or fortunately--the truth about test pilots is, they're
not cowboys. They're precision engineers and very calculated
risk-mitigators, hitting test cards with calm, methodical accuracy. The
risk isn't in their attitude. It's in the inherent hazards of testing
new technology under real conditions for the first time.)
But within those caveats, I've always maintained that the majority of
professional print journalists, anyway, try very, very hard to get the
story right. But recently, I had an experience that gave me a new
perspective on the issue.
A few weeks ago, I
attended the public launch of a company's product that had, until that
point, been kept tightly under wraps. The product involved a
breakthrough approach and new technology with the potential for
revolutionary impact on its industry, as well as on consumers
around the world. Unlike most of the journalists covering the event, I
was not an expert on that particular industry. It wasn't my normal
"beat." I was there because I'd been interviewing the
company's CEO over the previous several months for a book project. But
that also meant that while I wasn't an expert about the industry in
general, I was in the odd position of knowing more about the company's
"secret" product than any other journalist in the room.
It was an eye-opening experience. A lot of major news outlets and
publications were represented at the press conference following the
announcement. A few very general facts about the product had been
released, but the reporters had only been introduced to details about
it a half hour earlier. There was still a lot about how it worked, how
it differed from other emerging products, and why the company felt so
confident about its evolution and economic viability, that remained to
be explained. But the reporters' questions
weren't geared toward getting a better understanding of those points.
They were narrowly focused on one or two aspects of the story. And from
the questions that were being asked, I realized--because I had so much
more information on the subject--that the reporters were missing a
couple of really important pieces of understanding about the product
and its use. And as the event progressed, I also realized that the
questions that might have uncovered those pieces weren't being asked
because the reporters already had a story angle in their heads and were
focused only on getting the necessary data points to flesh out and back
up what they already thought was the story.
There is always a tension, as a journalist, between asking open-ended
questions that allow an interview subject to explain something and
pressing or challenging them on accuracy or details. But if you think
you already know the subject, or already have a story angle half-formed
in your head, it's easy to overlook the first part.
The journalists at the press conference didn't have a bias as the term is
normally used; that is, I didn't get the sense that they were
inherently for or against the company or its product. They just
appeared to think they knew the subject well enough, or had a set
enough idea in their heads as to what this kind of story was about,
that they pursued only the lines of questioning necessary to fill in
the blanks of that presumed story line. As a result, they left the
press conference with less knowledge and understanding than they
otherwise might have had. And while nobody could have said the
resulting stories were entirely wrong, they definitely suffered
from that lapse. Especially, as might be expected, when it came to the
predictions they made about the product's evolution or future.
In his new book, How We Decide,
Jonah Lehrer cites a research study done by U.C. Berkeley professor
Philip Tetlock. Tetlock questioned 284 people who made their living
"commenting or offering advice on political and economic trends,"
asking them to make predictions about future events. Over the course of
the study, Tetlock collected quantitative data on over 82,000
predictions, as well as information from follow-up interviews with the
subjects about the thought processes they'd used to come to those
predictions. His findings were surprising.
Most of Tetlock's questions about future events took the
form of specific, multiple-choice questions with three possible
answers. But for all their expertise, the pundits' predictions turned
out to be correct less than 33% of the time, which meant, as Lehrer
puts it, that a "dart-throwing chimp" would have had a higher rate of
success. Tetlock also found that the least accurate predictions were
made by the most famous experts in the group.
Why was that? According to Lehrer,
"The central error diagnosed by Tetlock was the sin of certainty,
which led the 'experts' to impose a top-down solution on their
decision-making processes ... When pundits were convinced that they
were right, they ignored any brain areas that implied they might be
wrong." Tetlock himself, Lehrer
says, concluded that "The dominant danger [for pundits] remains hubris,
the vice of closed-mindedness, of dismissing dissonant possibilities
too quickly." A friend of mine who's an editor at the New York Times
said those results don't surprise him at all. "If you watch a White
House press conference," he said, "you can tell who the new reporters
are. They're often the ones who ask the best questions." I must have
looked a little surprised. "Seriously," he said. "I actually think we
should rotate reporters' beats every two years, so nobody ever thinks
they're too much of an expert at anything."
It's an interesting idea. There's some advantage to having good background
in a subject, of course. For one thing, it takes a lot less time to
research and write a story if you at least know the general subject
matter and have tracked news developments in it over a period of time.
And while an expert can miss information because they assume they
already know what there is to know, a newcomer can miss information
from not knowing enough to know what there is to ask.
It's a tricky balance to try to strike--in part because assuming we know the
salient points of a topic or story isn't an obvious, conscious bias as
most people define or understand the term. Indeed, "practically all" of
the professionals in Tetlock's study claimed, and no doubt believed,
that they were dispassionately analyzing the evidence. But it's a
reminder that we all have, as Tetlock put it, the potential to become
"prisoners of our preconceptions." And that sometimes, even if we think
we know the story, it might be worth asking questions as if we don't.
Every now and then, we might hear or learn something that, as long as
we're open to hearing it, might change our minds about what the real
story is.