When a little boy, with some prompting from his mom, asked Texas Gov. Rick Perry how old he thought the earth was, Perry responded that he wasn't sure -- that evolution is just "a theory that's out there" and that's why Texas teaches creationism. "Ask him why he doesn't believe in science!" the kid's mom pretended to whisper to her son, though the comment was clearly directed at Perry. But it turns out Perry does believe in science after all -- when it's applied to politics. As Sasha Issenberg explains to The New York Times' David Leonhardt, Perry's 2006 campaign imported "eggheads" from academia to apply the scientific method to classic campaign tools.
Issenberg explains how the experiments began:
As the 2006 election season approached, the governor's top strategist, Dave Carney, invited four political scientists into Perry's war room and asked them to impose experimental controls on any aspect of the campaign budget that they could randomize and measure. Over the course of that year, the eggheads, as they were known within the campaign, ran experiments testing the effectiveness of all the things that political consultants do reflexively and we take for granted: candidate appearances, TV ads, robocalls, direct mail. These were basically the political world's version of randomized drug trials, which had been used by academics but never run from within a large-scale partisan campaign.
The eggheads controlled Perry's schedule for three days and randomly assigned his travel across Texas. During that time, they conducted a massive volume of polling calls -- large enough to discern significant movement in each city -- and tracked contributions and volunteer activity. They found that Perry's presence in a city had an impact: his approval ratings went up, and contributions and volunteer signups increased after he did a public event. Because they had randomized the schedule, the eggheads were able to attribute the changes to Perry's presence with confidence.
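The logic of the eggheads' design is the same as a randomized drug trial: because visits were assigned at random, a simple difference in means between visited and unvisited cities estimates the causal effect of a visit. The sketch below illustrates that logic with entirely invented city names and polling numbers -- it is not the campaign's actual data or analysis.

```python
import random

# Illustrative sketch of a randomized-schedule experiment.
# All cities and approval numbers here are made up for demonstration.
random.seed(42)

cities = [f"city_{i}" for i in range(20)]

# Random assignment: half the cities get a candidate visit (treatment).
visited = set(random.sample(cities, 10))

# Simulated change in polled approval after the tour: visited cities
# drift up by ~3 points on average, unvisited cities hover near zero.
approval_change = {
    c: (random.gauss(3.0, 1.0) if c in visited else random.gauss(0.0, 1.0))
    for c in cities
}

def mean(xs):
    return sum(xs) / len(xs)

treated = [approval_change[c] for c in cities if c in visited]
control = [approval_change[c] for c in cities if c not in visited]

# Because assignment was random, the two groups differ only by chance
# apart from the visits, so this difference in means is an unbiased
# estimate of the causal effect of a candidate appearance.
effect = mean(treated) - mean(control)
print(f"estimated effect of a visit: {effect:+.1f} points")
```

Without the randomization, a campaign comparing cities the candidate happened to visit against the rest would confound the visit's effect with whatever made those cities worth visiting in the first place.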
Like Carney, David Plouffe, Obama's campaign manager, is a ruthless empiricist who helped instill an analytical culture that pervaded the Obama organization. Everything that could be measured was measured, and data was used to judge the relative effectiveness of campaign techniques, both old and new. The big difference was in methodology. As best I can tell, the Obama campaign never used randomized trials to test its operations offline. ... That reflects the fact that most of the people doing analytics within the Obama campaign, unlike the eggheads, didn't come out of the academic social sciences, where after 2000 randomized field experiments became an increasingly popular tool for measuring basic political communication techniques. The great value of randomized trials -- as in medicine or agriculture or development policy -- is that they can not only find relationships in data but also establish causality. So there's a different level of authority in some of the Perry findings than anything that came out of Obama's headquarters, which helps to explain why Perry already leads Obama in at least one head-to-head metric: the number of scholarly papers published based on campaign research.
This article is from the archive of our partner The Wire.