Many times, at parties and in other conversations over the years, I
have vociferously defended fellow journalists against charges of bias
in their work. Particularly journalists working in the lowly field of
print journalism, as opposed to TV.
Not that I'd argue everyone in the field is perfect, unbiased, or even a good reporter.
And not that I haven't ever encountered an editor who really, really
wanted a story to say "X" as opposed to "Y." I remember one editor who
complained that a story I'd done about NASA test pilots didn't make
them sound like the wild cowboys he imagined they were.
(Unfortunately--or fortunately--the truth about test pilots is, they're
not cowboys. They're precision engineers and very calculated
risk-mitigators, hitting test cards with calm, methodical accuracy. The
risk isn't in their attitude. It's in the inherent hazards of testing
new technology under real conditions for the first time.)
But within those caveats, I've always maintained that the majority of
professional print journalists, anyway, try very, very hard to get the
story right. But recently, I had an experience that gave me a new
perspective on the issue.
A few weeks ago, I
attended the public launch of a company's product that had, until that
point, been kept tightly under wraps. The product involved a
breakthrough approach and new technology that had the potential to
have a revolutionary impact on its industry, as well as on consumers
around the world. Unlike most of the journalists covering the event, I
was not an expert on that particular industry. It wasn't my normal
"beat." The reason I was there was because I'd been interviewing the
company's CEO over the previous several months for a book project. But
that also meant that while I wasn't an expert about the industry in
general, I was in the odd position of knowing more about the company's
"secret" product than any other journalist in the room.
It was an eye-opening experience. A lot of major news outlets and
publications were represented at the press conference following the
announcement. A few very general facts about the product had been
released, but the reporters had only been introduced to details about
it a half hour earlier. There was still a lot about how it worked, how
it differed from other emerging products, and why the company felt so
confident about its evolution and economic viability, that remained to
be explained. But the reporters' questions
weren't geared toward getting a better understanding of those points.
They were narrowly focused on one or two aspects of the story. And from
the questions that were being asked, I realized--because I had so much
more information on the subject--that the reporters were missing a
couple of really important pieces of understanding about the product
and its use. And as the event progressed, I also realized that the
questions that might have uncovered those pieces weren't being asked
because the reporters already had a story angle in their heads and were
focused only on getting the necessary data points to flesh out and back
up what they already thought was the story.
There is always a tension, as a journalist, between asking open-ended
questions that allow an interview subject to explain something and
pressing or challenging them on accuracy or details. But if you think
you already know the subject, or already have a story angle half-formed
in your head, it's easy to overlook the first part.
The journalists at the press conference didn't have a bias as the term is
normally used; that is, I didn't get the sense that they were
inherently for or against the company or its product. They just
appeared to think they knew the subject well enough, or had a set
enough idea in their heads as to what this kind of story was about,
that they pursued only the lines of questioning necessary to fill in
the blanks of that presumed story line. As a result, they left the
press conference with less knowledge and understanding than they
otherwise might have had. And while nobody could have said the
resulting stories were entirely wrong, they definitely suffered
from that lapse. Especially, as might be expected, when it came to the
predictions they made about the product's evolution or future.
In his new book, How We Decide,
Jonah Lehrer cites a research study done by U.C. Berkeley professor
Philip Tetlock. Tetlock questioned 284 people who made their living
"commenting or offering advice on political and economic trends,"
asking them to make predictions about future events. Over the course of
the study, Tetlock collected quantitative data on over 82,000
predictions, as well as information from follow-up interviews with the
subjects about the thought processes they'd used to come to those
predictions. His findings were surprising.
Most of Tetlock's questions about the future events were put in the
form of specific, multiple choice questions, with three possible
answers. But for all their expertise, the pundits' predictions turned
out to be correct less than 33% of the time. Which meant, as Lehrer
puts it, that a "dart-throwing chimp" would have had a higher rate of
success. Tetlock also found that the least accurate predictions were
made by the most famous experts in the group.
Why was that? According to Lehrer,
"The central error diagnosed by Tetlock was the sin of certainty,
which led the 'experts' to impose a top-down solution on their
decision-making processes ... When pundits were convinced that they
were right, they ignored any brain areas that implied they might be
wrong." Tetlock himself, Lehrer
says, concluded that "The dominant danger [for pundits] remains hubris,
the vice of closed-mindedness, of dismissing dissonant possibilities
too quickly." A friend of mine who's an editor at the New York Times
said those results don't surprise him at all. "If you watch a White
House press conference," he said, "you can tell who the new reporters
are. They're often the ones who ask the best questions." I must have
looked a little surprised. "Seriously," he said. "I actually think we
should rotate reporters' beats every two years, so nobody ever thinks
they're too much of an expert at anything."
It's an interesting idea. There's some advantage to having good background
in a subject, of course. For one thing, it takes a lot less time to
research and write a story if you at least know the general subject
matter and have tracked news developments in it over a period of time.
And while an expert can miss information because they assume they
already know what there is to know, a newcomer can miss information
from not knowing enough to know what there is to ask.
It's a tricky balance to try to strike--in part because assuming we know the
salient points of a topic or story isn't an obvious, conscious bias as
most people define or understand the term. Indeed, "practically all" of
the professionals in Tetlock's study claimed, and no doubt believed,
that they were dispassionately analyzing the evidence. But it's a
reminder that we all have, as Tetlock put it, the potential to become
"prisoners of our preconceptions." And that sometimes, even if we think
we know the story, it might be worth asking questions as if we don't.
Every now and then, we might hear or learn something that, as long as
we're open to hearing it, might change our minds about what the real
story is.