Many times, at parties and in other conversations over the years, I
have vociferously defended fellow journalists against charges of bias
in their work. Particularly journalists working in the lowly field of
print journalism, as opposed to TV.
Not that everyone in the field is perfect, unbiased, or even a good reporter.
And not that I haven't ever encountered an editor who really, really
wanted a story to say "X" as opposed to "Y." I remember one editor who
complained that a story I'd done about NASA test pilots didn't make
them sound like the wild cowboys he imagined they were.
(Unfortunately--or fortunately--the truth about test pilots is, they're
not cowboys. They're precision engineers and very calculated
risk-mitigators, hitting test cards with calm, methodical accuracy. The
risk isn't in their attitude. It's in the inherent hazards of testing
new technology under real conditions for the first time.)
But within those caveats, I've always maintained that the majority of
professional print journalists, anyway, try very, very hard to get the
story right. But recently, I had an experience that gave me a new
perspective on the issue.
A few weeks ago, I
attended the public launch of a company's product that had, until that
point, been kept tightly under wraps. The product involved a
breakthrough approach and new technology that had the potential to have
a revolutionary impact on its industry, as well as on consumers
around the world. Unlike most of the journalists covering the event, I
was not an expert on that particular industry. It wasn't my normal
"beat." I was there because I'd been interviewing the
company's CEO over the previous several months for a book project. But
that also meant that while I wasn't an expert about the industry in
general, I was in the odd position of knowing more about the company's
"secret" product than any other journalist in the room.
It was an eye-opening experience. A lot of major news outlets and
publications were represented at the press conference following the
announcement. A few very general facts about the product had been
released, but the reporters had only been introduced to details about
it a half hour earlier. There was still a lot about how it worked, how
it differed from other emerging products, and why the company felt so
confident about its evolution and economic viability, that remained to
be explained. But the reporters' questions
weren't geared toward getting a better understanding of those points.
They were narrowly focused on one or two aspects of the story. And from
the questions that were being asked, I realized--because I had so much
more information on the subject--that the reporters were missing a
couple of really important pieces of understanding about the product
and its use. And as the event progressed, I also realized that the
questions that might have uncovered those pieces weren't being asked
because the reporters already had a story angle in their heads and were
focused only on getting the necessary data points to flesh out and back
up what they already thought was the story.
There is always a tension, as a journalist, between asking open-ended
questions that allow an interview subject to explain something and
pressing or challenging them on accuracy or details. But if you think
you already know the subject, or already have a story angle half-formed
in your head, it's easy to overlook the first part.
The journalists at the press conference didn't have a bias as the term is
normally used; that is, I didn't get the sense that they were
inherently for or against the company or its product. They just
appeared to think they knew the subject well enough, or had a set
enough idea in their heads as to what this kind of story was about,
that they pursued only the lines of questioning necessary to fill in
the blanks of that presumed story line. As a result, they left the
press conference with less knowledge and understanding than they
otherwise might have had. And while nobody could have said the
resulting stories were entirely wrong, they definitely suffered
from that lapse. Especially, as might be expected, when it came to the
predictions they made about the product's evolution or future.
In his new book, How We Decide,
Jonah Lehrer cites a research study done by U.C. Berkeley professor
Philip Tetlock. Tetlock questioned 284 people who made their living
"commenting or offering advice on political and economic trends,"
asking them to make predictions about future events. Over the course of
the study, Tetlock collected quantitative data on over 82,000
predictions, as well as information from follow-up interviews with the
subjects about the thought processes they'd used to come to those
predictions. His findings were surprising.
Most of Tetlock's questions about the future events were put in the
form of specific, multiple choice questions, with three possible
answers. But for all their expertise, the pundits' predictions turned
out to be correct less than 33% of the time. Which meant, as Lehrer
puts it, that a "dart-throwing chimp" would have had a higher rate of
success. Tetlock also found that the least accurate predictions were
made by the most famous experts in the group.
Why was that? According to Lehrer,
"The central error diagnosed by Tetlock was the sin of certainty,
which led the 'experts' to impose a top-down solution on their
decision-making processes ... When pundits were convinced that they
were right, they ignored any brain areas that implied they might be
wrong." Tetlock himself, Lehrer
says, concluded that "The dominant danger [for pundits] remains hubris,
the vice of closed-mindedness, of dismissing dissonant possibilities
too quickly." A friend of mine who's an editor at the New York Times
said those results don't surprise him at all. "If you watch a White
House press conference," he said, "you can tell who the new reporters
are. They're often the ones who ask the best questions." I must have
looked a little surprised. "Seriously," he said. "I actually think we
should rotate reporters' beats every two years, so nobody ever thinks
they're too much of an expert at anything."
It's an interesting idea. There's some advantage to having good background
in a subject, of course. For one thing, it takes a lot less time to
research and write a story if you at least know the general subject
matter and have tracked news developments in it over a period of time.
And while an expert can miss information because they assume they
already know what there is to know, a newcomer can miss information
from not knowing enough to know what there is to ask.
It's a tricky balance to try to strike--in part because assuming we know the
salient points of a topic or story isn't an obvious, conscious bias as
most people define or understand the term. Indeed, "practically all" of
the professionals in Tetlock's study claimed, and no doubt believed,
that they were dispassionately analyzing the evidence. But it's a
reminder that we all have, as Tetlock put it, the potential to become
"prisoners of our preconceptions." And that sometimes, even if we think
we know the story, it might be worth asking questions as if we don't.
Every now and then, we might hear or learn something that, as long as
we're open to hearing it, might change our minds about what the real
story is.