Many times, at parties and in other conversations over the years, I
have vociferously defended fellow journalists against charges of bias
in their work, particularly journalists working in the lowly field of
print journalism, as opposed to TV.
Not that everyone in the field is perfect, unbiased, or even a good reporter.
And not that I haven't ever encountered an editor who really, really
wanted a story to say "X" as opposed to "Y." I remember one editor who
complained that a story I'd done about NASA test pilots didn't make
them sound like the wild cowboys he imagined they were.
(Unfortunately--or fortunately--the truth about test pilots is, they're
not cowboys. They're precision engineers and very calculated
risk-mitigators, hitting test cards with calm, methodical accuracy. The
risk isn't in their attitude. It's in the inherent hazards of testing
new technology under real conditions for the first time.)
But within those caveats, I've always maintained that the majority of
professional print journalists, anyway, try very, very hard to get the
story right. But recently, I had an experience that gave me a new
perspective on the issue.
A few weeks ago, I
attended the public launch of a company's product that had, until that
point, been kept tightly under wraps. The product involved a
breakthrough approach and new technology that had the potential to
have a revolutionary impact on its industry, as well as on consumers
around the world. Unlike most of the journalists covering the event, I
was not an expert on that particular industry. It wasn't my normal
"beat." I was there because I'd been interviewing the company's CEO
over the previous several months for a book project. But
that also meant that while I wasn't an expert about the industry in
general, I was in the odd position of knowing more about the company's
"secret" product than any other journalist in the room.
It was an eye-opening experience. A lot of major news outlets and
publications were represented at the press conference following the
announcement. A few very general facts about the product had been
released, but the reporters had only been introduced to details about
it a half hour earlier. There was still a lot about how it worked, how
it differed from other emerging products, and why the company felt so
confident about its evolution and economic viability, that remained to
be explained. But the reporters' questions
weren't geared toward getting a better understanding of those points.
They were narrowly focused on one or two aspects of the story. And from
the questions that were being asked, I realized--because I had so much
more information on the subject--that the reporters were missing a
couple of really important pieces of understanding about the product
and its use. And as the event progressed, I also realized that the
questions that might have uncovered those pieces weren't being asked
because the reporters already had a story angle in their heads and were
focused only on getting the necessary data points to flesh out and back
up what they already thought was the story.
There is always a tension, as a journalist, between asking open-ended
questions that allow an interview subject to explain something and
pressing or challenging them on accuracy or details. But if you think
you already know the subject, or already have a story angle half-formed
in your head, it's easy to overlook the first part.
The journalists at the press conference didn't have a bias as the term is
normally used; that is, I didn't get the sense that they were
inherently for or against the company or its product. They just
appeared to think they knew the subject well enough, or had a set
enough idea in their heads as to what this kind of story was about,
that they pursued only the lines of questioning necessary to fill in
the blanks of that presumed story line. As a result, they left the
press conference with less knowledge and understanding than they
otherwise might have had. And while nobody could have said the
resulting stories were entirely wrong, they definitely suffered
from that lapse. Especially, as might be expected, when it came to the
predictions they made about the product's evolution or future.
In his new book, How We Decide,
Jonah Lehrer cites a research study done by U.C. Berkeley professor
Philip Tetlock. Tetlock questioned 284 people who made their living
"commenting or offering advice on political and economic trends,"
asking them to make predictions about future events. Over the course of
the study, Tetlock collected quantitative data on over 82,000
predictions, as well as information from follow-up interviews with the
subjects about the thought processes they'd used to come to those
predictions. His findings were surprising.
Most of Tetlock's questions about future events were put in the
form of specific, multiple-choice questions with three possible
answers. But for all their expertise, the pundits' predictions turned
out to be correct less than 33% of the time. Which meant, as Lehrer
puts it, that a "dart-throwing chimp" would have had a higher rate of
success. Tetlock also found that the least accurate predictions were
made by the most famous experts in the group.
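To see why "less than 33 percent" is so damning, note that on a three-option question, blind guessing succeeds about one time in three. A minimal sketch (in Python, purely for illustration; the 82,000 figure is the number of predictions cited above) simulates that chance baseline:

```python
import random

random.seed(42)

# Simulate a "dart-throwing chimp": random guesses on 82,000
# three-option questions, matching the number of predictions
# Tetlock collected in his study.
n_questions = 82_000
correct = sum(random.randrange(3) == 0 for _ in range(n_questions))

chance_rate = correct / n_questions
print(f"chance accuracy: {chance_rate:.1%}")  # roughly 33.3%
```

Any accuracy below that one-in-three baseline means the pundits did worse than blind chance, which is the point of Lehrer's dart-throwing-chimp comparison.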
Why was that? According to Lehrer,
"The central error diagnosed by Tetlock was the sin of certainty,
which led the 'experts' to impose a top-down solution on their
decision-making processes ... When pundits were convinced that they
were right, they ignored any brain areas that implied they might be
wrong." Tetlock himself, Lehrer
says, concluded that "The dominant danger [for pundits] remains hubris,
the vice of closed-mindedness, of dismissing dissonant possibilities
too quickly." A friend of mine who's an editor at the New York Times
said those results don't surprise him at all. "If you watch a White
House press conference," he said, "you can tell who the new reporters
are. They're often the ones who ask the best questions." I must have
looked a little surprised. "Seriously," he said. "I actually think we
should rotate reporters' beats every two years, so nobody ever thinks
they're too much of an expert at anything."
It's an interesting idea. There's some advantage to having good background
in a subject, of course. For one thing, it takes a lot less time to
research and write a story if you at least know the general subject
matter and have tracked news developments in it over a period of time.
And while an expert can miss information because they assume they
already know what there is to know, a newcomer can miss information
from not knowing enough to know what there is to ask.
It's a tricky balance to try to strike--in part because assuming we know the
salient points of a topic or story isn't an obvious, conscious bias as
most people define or understand the term. Indeed, "practically all" of
the professionals in Tetlock's study claimed, and no doubt believed,
that they were dispassionately analyzing the evidence. But it's a
reminder that we all have, as Tetlock put it, the potential to become
"prisoners of our preconceptions." And that sometimes, even if we think
we know the story, it might be worth asking questions as if we don't.
Every now and then, we might hear or learn something that, as long as
we're open to hearing it, might change our minds about what the real
story is.