OkCupid, which according to The Boston Globe aspires to be the Google of online dating, has been particularly aggressive about tracking users. The company’s goal is to stimulate “three-ways”—a double entendre that, to someone at OkCupid, means a person sent a note, received a reply, and fired off a follow-up.
“Imagine if you had a video camera at every bar in the country,” Sam Yagan, a co-founder of OkCupid, told me. “You’d have all these data that reveal things about society and predict them. This isn’t a survey. It isn’t a lab experiment. These are millions of people going about their lives. We just happen to be able to track and quantify everything about it.”
The company can quantify things you could guess but might rather not prove. For instance, all races of women respond better to white men than they should based on the men’s looks. Black women, as a group, are the least likely to have their missives returned, but they are the most likely to respond to messages.
I asked Yagan whether OkCupid might try tailoring its algorithm to surface more statistically successful racial combinations. Such a measure wasn’t out of the question, he said. “Imagine we did a lot of research, and we found that there were certain demographic or psychographic attributes that were predictors of three-ways. Hispanic men and Indian women, say,” Yagan suggested. “If we thought that drove success, we could tweak it so those matches showed up more often. Not because of a social mission, but because if it’s working, there needs to be more of it.”
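The tweak Yagan describes could be sketched in a few lines. Everything here is hypothetical, not OkCupid's actual system: the table of historical "three-way" rates, the baseline, and the function name are all invented for illustration. The idea is simply to multiply a candidate's existing match score by how much better than average that demographic pairing has historically converted.

```python
# Hypothetical historical rate of completed exchanges
# (note -> reply -> follow-up) for (seeker, candidate) attribute pairs.
# These numbers are invented for illustration.
THREE_WAY_RATE = {
    ("hispanic", "indian"): 0.31,
    ("indian", "hispanic"): 0.28,
    ("hispanic", "hispanic"): 0.22,
}
BASELINE_RATE = 0.20  # assumed site-wide average

def rank_candidates(seeker_attr, candidates, base_scores):
    """Re-rank candidates, boosting pairs that historically predict
    three-ways. `base_scores` are the match scores before the tweak
    (e.g., questionnaire compatibility)."""
    def boosted(attr, score):
        rate = THREE_WAY_RATE.get((seeker_attr, attr), BASELINE_RATE)
        # Multiplicative boost: a pairing twice as likely to produce a
        # three-way counts for twice as much.
        return score * (rate / BASELINE_RATE)
    scored = [(boosted(a, s), a) for a, s in zip(candidates, base_scores)]
    return [attr for _, attr in sorted(scored, reverse=True)]
```

With equal base scores, the historically "successful" pairing now surfaces first: `rank_candidates("hispanic", ["indian", "hispanic", "other"], [0.5, 0.5, 0.5])` returns `["indian", "hispanic", "other"]`. Nothing in the code knows why the pairing works; it only knows that it does, which is exactly the normativity-by-feedback-loop the passage goes on to describe.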
Now imagine the reverse, in the past or the future. What if dating sites had existed in the 1950s? How would they have handled interracial matches? Given the female response to white men in 2010, should white men show up more often? “We could do some really screwed-up things,” Yagan admitted. Imagine war broke out with China, causing Chinese users’ ratings to plummet: would dating Web sites start reducing the number of Chinese people who showed up in other groups’ searches?
Algorithms are made to restrict the amount of information the user sees—that’s their raison d’être. By drawing on data about the world we live in, they end up reinforcing whatever societal values happen to be dominant, without our even noticing. They are normativity made into code—albeit a code that we barely understand, even as it shapes our lives.
We’re not going to stop using algorithms. They’re too useful. But we need to be more aware of the algorithmic perversity that’s creeping into our lives. The short-term fit of a dating match or a Web page doesn’t measure the long-term value it may hold. Statistically likely does not mean correct, or just, or fair. Google-generated serendipity is meretricious, offering a desiccated kind of choice. It’s when people deviate from what we predict they’ll do that they prove they are individuals, set apart from all others of the human type.