Many times, at parties and in other conversations over the years, I
have vociferously defended fellow journalists against charges of bias
in their work. Particularly journalists working in the lowly field of
print journalism, as opposed to TV.
That's not to say that everyone in the field is perfect, unbiased, or even a good reporter.
And not that I haven't ever encountered an editor who really, really
wanted a story to say "X" as opposed to "Y." I remember one editor who
complained that a story I'd done about NASA test pilots didn't make
them sound like the wild cowboys he imagined they were.
(Unfortunately--or fortunately--the truth about test pilots is, they're
not cowboys. They're precision engineers and very calculated
risk-mitigators, hitting test cards with calm, methodical accuracy. The
risk isn't in their attitude. It's in the inherent hazards of testing
new technology under real conditions for the first time.)
But within those caveats, I've always maintained that the majority of
professional print journalists, anyway, try very, very hard to get the
story right. But recently, I had an experience that gave me a new
perspective on the issue.
A few weeks ago, I
attended the public launch of a company's product that had, until that
point, been kept tightly under wraps. The product involved a
breakthrough approach and new technology with the potential to have a
revolutionary impact on its industry, as well as on consumers
around the world. Unlike most of the journalists covering the event, I
was not an expert on that particular industry. It wasn't my normal
"beat." I was there because I'd been interviewing the
company's CEO over the previous several months for a book project. But
that also meant that while I wasn't an expert about the industry in
general, I was in the odd position of knowing more about the company's
"secret" product than any other journalist in the room.
It was an eye-opening experience. A lot of major news outlets and
publications were represented at the press conference following the
announcement. A few very general facts about the product had been
released, but the reporters had only been introduced to details about
it a half hour earlier. There was still a lot about how it worked, how
it differed from other emerging products, and why the company felt so
confident about its evolution and economic viability, that remained to
be explained. But the reporters' questions
weren't geared toward getting a better understanding of those points.
They were narrowly focused on one or two aspects of the story. And from
the questions that were being asked, I realized--because I had so much
more information on the subject--that the reporters were missing a
couple of really important pieces of understanding about the product
and its use. And as the event progressed, I also realized that the
questions that might have uncovered those pieces weren't being asked
because the reporters already had a story angle in their heads and were
focused only on getting the necessary data points to flesh out and back
up what they already thought was the story.
There is always a tension, as a journalist, between asking open-ended
questions that allow an interview subject to explain something and
pressing or challenging them on accuracy or details. But if you think
you already know the subject, or already have a story angle half-formed
in your head, it's easy to overlook the first part.
The journalists at the press conference didn't have a bias as the term is
normally used; that is, I didn't get the sense that they were
inherently for or against the company or its product. They just
appeared to think they knew the subject well enough, or had a set
enough idea in their heads as to what this kind of story was about,
that they pursued only the lines of questioning necessary to fill in
the blanks of that presumed story line. As a result, they left the
press conference with less knowledge and understanding than they
otherwise might have had. And while nobody could have said the
resulting stories were entirely wrong, they definitely suffered
from that lapse. Especially, as might be expected, when it came to the
predictions they made about the product's evolution or future.
In his new book, How We Decide,
Jonah Lehrer cites a research study done by U.C. Berkeley professor
Philip Tetlock. Tetlock questioned 284 people who made their living
"commenting or offering advice on political and economic trends,"
asking them to make predictions about future events. Over the course of
the study, Tetlock collected quantitative data on over 82,000
predictions, as well as information from follow-up interviews with the
subjects about the thought processes they'd used to come to those
predictions. His findings were surprising.
Most of Tetlock's questions about the future events were put in the
form of specific, multiple choice questions, with three possible
answers. But for all their expertise, the pundits' predictions turned
out to be correct less than 33% of the time. Which meant, as Lehrer
puts it, that a "dart-throwing chimp" would have had a higher rate of
success. Tetlock also found that the least accurate predictions were
made by the most famous experts in the group.
Why was that? According to Lehrer,
"The central error diagnosed by Tetlock was the sin of certainty,
which led the 'experts' to impose a top-down solution on their
decision-making processes ... When pundits were convinced that they
were right, they ignored any brain areas that implied they might be
wrong." Tetlock himself, Lehrer
says, concluded that "The dominant danger [for pundits] remains hubris,
the vice of closed-mindedness, of dismissing dissonant possibilities
too quickly." A friend of mine who's an editor at the New York Times
said those results don't surprise him at all. "If you watch a White
House press conference," he said, "you can tell who the new reporters
are. They're often the ones who ask the best questions." I must have
looked a little surprised. "Seriously," he said. "I actually think we
should rotate reporters' beats every two years, so nobody ever thinks
they're too much of an expert at anything."
It's an interesting idea. There's some advantage to having good background
in a subject, of course. For one thing, it takes a lot less time to
research and write a story if you at least know the general subject
matter and have tracked news developments in it over a period of time.
And while an expert can miss information because they assume they
already know what there is to know, a newcomer can miss information
from not knowing enough to know what there is to ask.
It's a tricky balance to try to strike--in part because assuming we know the
salient points of a topic or story isn't an obvious, conscious bias as
most people define or understand the term. Indeed, "practically all" of
the professionals in Tetlock's study claimed, and no doubt believed,
that they were dispassionately analyzing the evidence. But it's a
reminder that we all have, as Tetlock put it, the potential to become
"prisoners of our preconceptions." And that sometimes, even if we think
we know the story, it might be worth asking questions as if we don't.
Every now and then, we might hear or learn something that, as long as
we're open to hearing it, might change our minds about what the real
story is.