In early 2015, a team of 56 volunteers knocked on the doors of conservative voters in Miami, Florida to talk about transgender rights. Local officials had recently passed a law that protected transgender people from discrimination, but LGBT organizations were concerned about backlash, repeals, and counter-legislation (of the kind recently seen in North Carolina).
So volunteers from the Los Angeles LGBT Center and SAVE, a Florida LGBT organization, asked voters what they thought about the recent law. Would they watch this video and talk about their reactions? Could they talk about a time when they had been on the receiving end of negative judgment or stigma? Did that help them to understand what a transgender life is like? Did that change their views?
It was a deliberate strategy, and it worked—durably and dramatically. These ten-minute conversations, known as “deep canvassing,” substantially reduced prejudice against transgender people for at least three months, even in the face of anti-transgender ad campaigns. Not all the voters were swayed, but on average, they experienced a drop in transphobia greater than the fall in homophobia among average Americans from 1998 to 2012. The canvassers, through ten-minute chats, had produced the equivalent of 14 years of social change.
That might seem eerily familiar because it’s the same punchline from one of the most infamous cases of scientific fraud in the last few years. In December 2014, political scientist Michael LaCour published a paper in which he supposedly evaluated the Los Angeles LGBT Center’s canvassers and found that they strongly and persistently reduced prejudices against same-sex marriage. Five months later, it turned out that his data was fabricated. His paper, exposed as a sham, was swiftly retracted.
The scandal was a huge blow to the LGBT Center—but the new study, published in the same journal, offers redemption. Their methods do work. And in an extra twist, the vindicating results come from the same researchers who uncovered LaCour’s fraud: David Broockman from Stanford University, and Joshua Kalla from the University of California, Berkeley.
They evaluated the same deep canvassing technique, and found even stronger results. For example, LaCour claimed that the conversations only work if the canvassers are themselves gay; Broockman and Kalla (after, y’know, actually doing some science) found that both transgender and non-transgender canvassers could change minds.
“This is something that other practitioners now know with confidence that they can adopt because they know it works,” says Broockman.
* * *
In November 2008, Californians voted to pass Proposition 8, an amendment that would stop same-sex couples from marrying. It was a huge blow for the LGBT community, not least because all the preceding polling suggested that the proposition would be defeated. “People had really expected to win,” says Dave Fleischer, now director of the Leadership Lab at the L.A. LGBT Center. “In the wake of it, people really wanted to do something, but they didn’t know what.”
He decided, simply, to talk to the people who had voted for the proposition. “I felt really smart because it was a great idea,” he tells me. “And I felt like an idiot because I’m 61 and I’ve been on the losing side of many elections, but it had never occurred to me to do this before.”
Over the next year and 13,000 conversations, Fleischer’s team of volunteers tried many strategies for swinging conservative voters to their side. Almost all of them failed. “One very popular idea was to tell our own stories; we thought that by itself would be enough,” he recalls. It wasn’t. “But when we tell a story and make ourselves vulnerable, it makes it easier for the voter to decide that we’re not going to judge them. And it makes it easier for us to elicit their story. That turns out to be by far the most important thing we do. Almost everything we do now is in service of getting the voter to honestly relive their experiences, recall it aloud, and reflect on it.”
Fleischer was on to something; he knew it. His own internal assessments told him that they were changing opinion, but he didn’t know to what degree, or how long the changes would last. He wanted an independent scientist to evaluate their approach. Enter LaCour. He randomly assigned voters to be canvassed on either gay marriage or recycling, and was supposed to survey them before and after. But while the canvassers did their thing, LaCour did nothing. And he might have gotten away with it too, if it weren’t for Broockman and Kalla.
Broockman studied under Donald Green from Yale University, the second author on the LaCour paper (and who was uninvolved in the fraud). He had long been interested in politics, activism, and door-to-door canvassing. And he had gone into research specifically to forge better links between political activists doing frontline work, and political academics studying them from afar. LaCour’s study sat squarely in his wheelhouse. When he heard about the results in 2013, he was deeply impressed.
He was equally intrigued by the L.A. LGBT Center and hatched a collaboration with Fleischer. Broockman would follow the canvassers to Miami to evaluate whether their efforts were countering transphobia, replicating the work that LaCour had supposedly done in California. But when he looked at the minutiae of LaCour’s study, he found troubling holes. “It’s like when a friend gives you a great recipe at a dinner party, and when you try to follow it, you say: Wait, there’s a lot that’s vague here. We tried to track down the details, which is when we learned that there were no details!”
After a thorough investigation, Broockman, Kalla, and Peter Aronow wrote a 27-page takedown, showing that LaCour couldn’t possibly have carried out the work he claimed to. They posted it online on May 19, 2015. Nine days later, LaCour’s paper was retracted.
“I don’t know if you’ve ever had the experience of working with someone for a couple of years and discovering that everything they’ve told you is a lie, but I had not [until then]. And it was dreadful,” says Fleischer. “It felt like a punch in the gut.” Worse still, the fraud threatened to derail the team’s work. It gave ammo to those who had always been skeptical that a short conversation could really change behavior. The implication was: Obviously, the study was a sham.
That just motivated Fleischer. “We never lost faith that we were on track to developing something good,” he says, “but we didn’t know if we were 10 percent of the way there or 80 percent.”
Broockman felt the same. Psychological theory told him that people are more likely to change their minds if they think something through carefully and actively, and if they try and imagine themselves in someone else’s shoes. These two principles—active processing and perspective-taking—are old parts of the psychological canon, but Fleischer’s team had effectively rediscovered them through their door-to-door work. They specifically tried to get voters to talk through their opinions, and to put themselves in the shoes of transgender people. One man, a military veteran, found common ground between his inability to find jobs because of PTSD and the discrimination that transgender people face.
Their techniques might be old, but “Broockman and Kalla show how they can be scaled to reach—and impact—large numbers of people in relatively short periods of time,” says Hahrie Han from University of California, Santa Barbara. “Their findings challenge common campaign strategies that try to transactionally reach the largest quantity of voters possible without taking into consideration the quality or authenticity of the interactions they have.”
“Canvassing organizations typically have quick and shallow conversations,” says Broockman. “They’ll read some talking point off a page and say, ‘Does that sound good to you? You should vote!’ The LGBT Center was trying to have a lasting impact.” And, as he showed, they’re succeeding. “They suffered a terrible blow when LaCour’s panel surveys turned out to be phony, as their outreach efforts were written off by many as naive,” says Donald Green. “Now they have a proper scholarly evaluation of their innovative and important work.”
* * *
For their own trial, Broockman and Kalla mailed an ostensibly unrelated baseline survey to 68,000 Miami voters, and then randomized 1,825 of them to be canvassed either about transgender issues or recycling. Following the conversations, Broockman and Kalla sent out follow-up surveys after 3 days, 3 weeks, 6 weeks, and 3 months.
These revealed that the canvassing was working. On average, voters became more positive towards transgender people, and more supportive of the recent non-discrimination law. And contrary to what LaCour claimed, all the canvassers, transgender or not, were effective at changing minds. “My view as a gay person is that it’s not that surprising,” says Broockman.
He was more surprised that the effects lasted. “The moment I backed away from my computer was when I saw the three-week results, and they were just as good as the three-day results—and so were the six-week and three-month ones,” he says. Even when the voters saw anti-transgender political ads, they were still more supportive of transgender rights than they used to be. And six weeks later, the ads’ effects had evaporated but the canvassing was still leaving its mark.
“There are very few rigorous randomized trials on prejudice reduction and most of the effective interventions are very involved,” says David Nickerson from Temple University. “[This study] is nearly unique in that it is rigorous, the intervention is simple, and the effects are durable.” In a related commentary, Elizabeth Levy Paluck from Princeton University notes that 60 percent of previous studies on prejudice reduction involved no experiments, 29 percent were confined to laboratories, only 11 percent took place in the real world, and only a fraction of those involved adults. Broockman and Kalla’s results “stand alone,” she writes.
She also rejects the idea that such a brief chat couldn’t have such dramatic results. “The 10 minutes consisted of a conversation with a stranger about a memory of personal vulnerability and its relevance to a social issue,” she writes. “We might question whether [it] is in fact unusually minor.”
Indeed, the counter-intuitive success of the study says something rather tragic about the way we talk to one another. “The thing is that people are very rarely listened to, in a way that makes them really think through the decisions they make in their lives,” says Broockman. “Yes, it’s a relatively brief interaction but it’s very different from what people experience in their day-to-day lives.” It’s not that different from cognitive-behavioral therapy.
And “on most people, it doesn’t work,” he adds. “Our best guess is that we’re affecting one in ten people. It’s not like some panacea.”
Wary of the attention that will undoubtedly be heaped upon his work, Broockman plans on posting his data and materials online. In the meantime, he is thinking about looking at other contentious issues, including climate change, vaccination, and gun control. And he is optimistic.
“A lot of people thought that the LaCour episode meant that we should doubt science,” he says. “But if anyone says to you, ‘The great thing about my field is that no mistakes are made and no fraud is done,’ you should think, ‘That just means you can’t detect it.’” Foul play will happen. But the fact that the LaCour fraud was uncovered, and ultimately overwritten by a better study “is a success story of self-correction,” he says. “I think it means that science is getting better at creating the norms and institutions it needs to create unbiased, rigorous knowledge.”
And his new results, rather than being the epilogue to last year’s misconduct, are “just the prologue of a lot more to come using this canvassing method,” he says. “We’re working to produce knowledge that people can use to make the world a better place.”