Many times, at parties and in other conversations over the years, I
have vociferously defended fellow journalists against charges of bias
in their work. Particularly journalists working in the lowly field of
print journalism, as opposed to TV.
That's not to say I think everyone in the field is perfect, unbiased, or even a good reporter.
And not that I haven't ever encountered an editor who really, really
wanted a story to say "X" as opposed to "Y." I remember one editor who
complained that a story I'd done about NASA test pilots didn't make
them sound like the wild cowboys he imagined they were.
(Unfortunately--or fortunately--the truth about test pilots is, they're
not cowboys. They're precision engineers and very calculated
risk-mitigators, hitting test cards with calm, methodical accuracy. The
risk isn't in their attitude. It's in the inherent hazards of testing
new technology under real conditions for the first time.)
But within those caveats, I've always maintained that the majority of
professional print journalists, anyway, try very, very hard to get the
story right. But recently, I had an experience that gave me a new
perspective on the issue.
A few weeks ago, I
attended the public launch of a company's product that had, until that
point, been kept tightly under wraps. The product involved a
breakthrough approach and new technology that had the potential to have
a revolutionary impact on its industry, as well as on consumers
around the world. Unlike most of the journalists covering the event, I
was not an expert on that particular industry. It wasn't my normal
"beat." I was there because I'd been interviewing the
company's CEO over the previous several months for a book project. But
that also meant that while I wasn't an expert on the industry in
general, I was in the odd position of knowing more about the company's
"secret" product than any other journalist in the room.
It was an eye-opening experience. A lot of major news outlets and
publications were represented at the press conference following the
announcement. A few very general facts about the product had been
released, but the reporters had only been introduced to details about
it a half hour earlier. There was still a lot about how it worked, how
it differed from other emerging products, and why the company felt so
confident about its evolution and economic viability, that remained to
be explained. But the reporters' questions
weren't geared toward getting a better understanding of those points.
They were narrowly focused on one or two aspects of the story. And from
the questions that were being asked, I realized--because I had so much
more information on the subject--that the reporters were missing a
couple of really important pieces of understanding about the product
and its use. And as the event progressed, I also realized that the
questions that might have uncovered those pieces weren't being asked
because the reporters already had a story angle in their heads and were
focused only on getting the necessary data points to flesh out and back
up what they already thought was the story.
There is always a tension, as a journalist, between asking open-ended
questions that allow an interview subject to explain something and
pressing or challenging them on accuracy or details. But if you think
you already know the subject, or already have a story angle half-formed
in your head, it's easy to overlook the first part.
The journalists at the press conference didn't have a bias as the term is
normally used; that is, I didn't get the sense that they were
inherently for or against the company or its product. They just
appeared to think they knew the subject well enough, or had a set
enough idea in their heads as to what this kind of story was about,
that they pursued only the lines of questioning necessary to fill in
the blanks of that presumed story line. As a result, they left the
press conference with less knowledge and understanding than they
otherwise might have had. And while nobody could have said the
resulting stories were entirely wrong, they definitely suffered
from that lapse. Especially, as might be expected, when it came to the
predictions they made about the product's evolution or future.
In his new book, How We Decide,
Jonah Lehrer cites a research study done by U.C. Berkeley professor
Philip Tetlock. Tetlock questioned 284 people who made their living
"commenting or offering advice on political and economic trends,"
asking them to make predictions about future events. Over the course of
the study, Tetlock collected quantitative data on over 82,000
predictions, as well as information from follow-up interviews with the
subjects about the thought processes they'd used to come to those
predictions. His findings were surprising.
Most of Tetlock's questions about future events were put in the
form of specific multiple-choice questions, with three possible
answers. But for all their expertise, the pundits' predictions turned
out to be correct less than 33% of the time. Which meant, as Lehrer
puts it, that a "dart-throwing chimp" would have had a higher rate of
success. Tetlock also found that the least accurate predictions were
made by the most famous experts in the group.
Why was that? According to Lehrer,
"The central error diagnosed by Tetlock was the sin of certainty,
which led the 'experts' to impose a top-down solution on their
decision-making processes ... When pundits were convinced that they
were right, they ignored any brain areas that implied they might be
wrong." Tetlock himself, Lehrer
says, concluded that "The dominant danger [for pundits] remains hubris,
the vice of closed-mindedness, of dismissing dissonant possibilities
too quickly." A friend of mine who's an editor at the New York Times
said those results don't surprise him at all. "If you watch a White
House press conference," he said, "you can tell who the new reporters
are. They're often the ones who ask the best questions." I must have
looked a little surprised. "Seriously," he said. "I actually think we
should rotate reporters' beats every two years, so nobody ever thinks
they're too much of an expert at anything."
It's an interesting idea. There's some advantage to having good background
in a subject, of course. For one thing, it takes a lot less time to
research and write a story if you at least know the general subject
matter and have tracked news developments in it over a period of time.
And while an expert can miss information because they assume they
already know what there is to know, a newcomer can miss information
from not knowing enough to know what there is to ask.
It's a tricky balance to try to strike--in part because assuming we know the
salient points of a topic or story isn't an obvious, conscious bias as
most people define or understand the term. Indeed, "practically all" of
the professionals in Tetlock's study claimed, and no doubt believed,
that they were dispassionately analyzing the evidence. But it's a
reminder that we all have, as Tetlock put it, the potential to become
"prisoners of our preconceptions." And that sometimes, even if we think
we know the story, it might be worth asking questions as if we don't.
Every now and then, we might hear or learn something that, as long as
we're open to hearing it, might change our minds about what the real
story is.