In his novel Super Sad True Love Story, Gary Shteyngart imagined ubiquitous poles installed on sidewalks that display people’s credit scores as they walk by. Friends and strangers could instantly size one another up using a simple number. Shteyngart also imagined the government turning these poles to its own ends—clumsily urging some citizens to spend, and others to save, based on their location and race.
Natasha Singer, a tech reporter for The New York Times, recently called the book a “fictional forecast of a data-deterministic culture in which computer algorithms constantly analyze consumers’ profiles.” That forecast is fast becoming reality, with deeply troubling consequences.
In its simplest form, credit scoring can be unfairly reductive. One person might fail to pay a bill because he or she was dealing with a medical emergency; another may simply be a deadbeat. But scores tend to treat both the same, flattening a real wrong and a sad but understandable failure to pay into a single numerical hit to a person’s reputation—one that will affect whether they get credit, and how much that credit will cost. As one writer observed, “Your credit report is evaluated by computers. When you apply for a loan, there's not a guy sitting down reading your report and looking for a statement, saying, 'Oh, OK, she was sick and that explains it.' That's not happening anymore.” Though consumers have a right to annotate credit reports, such explanations matter little when simple scores govern lenders’ actions.
Though consumers have not been able to glimpse the actual algorithms for setting scores, some basic contours of credit scoring are intelligible: Don’t run up too much debt, and don’t be late on payments. But by 2009, financiers were scrutinizing more data, in ways completely opaque to scored consumers. Buy “little felt pads that stop chair legs from scratching the floor”? You might be rewarded with a higher credit line, or lower interest-rate offers. Purchase a beer at a billiards bar? Expect the opposite.
America’s obsession with scoring has gone far beyond credit. As Pam Dixon and Bob Gellman documented in their World Privacy Forum report “The Scoring of America,” thousands of scores exist, ranging from measurements of one’s reliability as an employee to one’s “medication adherence” as a patient. Most of these scores aren’t publicly available, and it’s obvious why: Most people don’t appreciate being scored by strangers, and might protest if they knew these secret evaluations were going on. The recent backlash against Peeple, an app aspiring to be the “Yelp for People,” showed just how strong the reaction could be once a feedback system got a critical mass of attention.
Yet the trend toward scoring continues, and in China, it has gone into overdrive. Jay Stanley, a senior policy analyst at the ACLU’s Speech, Privacy & Technology Project, has summarized a series of disturbing news stories on China’s “Planning Outline for the Construction of a Social Credit System.” As Stanley observes, “Among the things that will hurt a citizen’s score are posting political opinions without prior permission, or posting information that the regime does not like.” The scoring system will also penalize or reward individuals for what their friends post. A plan for basing individuals’ credit on the records of their friends, so far only conceptualized by Facebook, appears set for widespread implementation abroad.
The Chinese scoring systems are projected to probe well beyond political orthodoxy, too. According to one Chinese firm’s press release, its “Sesame Credit” service is “conducting a test program with a Chinese online dating site that allows suitors to check their potential dates’ credit ratings to make sure they are not meeting someone who is dishonest or untrustworthy.” Given the recent finding that “the higher your credit score, the higher your chances of a lasting relationship,” it’s easy to imagine a similar service influencing the black-boxed algorithms of American dating sites. Or it could simply be integrated into a new iteration of Google Glass, replacing 20th-century “beer goggles” with the pristine clarity of “credit score microscopes.”
Back in 2004, Daniel Solove, a professor at the George Washington University Law School, presciently observed: “It is ever more possible to create an electronic collage that covers much of a person’s life—a life captured in records, a digital biography composed in the collective computer networks of the world.” There may be scenarios where that information could be put to good use, informing health professionals or quantified-self enthusiasts. But once the competitive dynamism of scoring enters the data mix, those positive uses become comparatively less important, and perhaps more distant. Spurious comparisons abound, correlation becomes mixed up with causation, and self-fulfilling prophecies are inevitable.
Policymakers should discourage the expansion of credit scoring into life scoring—or, at the very least, require disclosure of all the data and algorithms behind the scores to the people being scored. There needs to be a recognition that scoring can be “highly reductionist[,] atomizing complex, contingent relationships into simplified, one-dimensional measures that cannot provide a full and multidimensional picture” of individuals. It’s not necessarily an innovation to celebrate. Rather, it can be a prelude to the discrimination that’s rightly condemned. And before succumbing to the voyeuristic thrill of submitting friends or strangers to the consumer-facing versions of these scores, everyone should carefully consider the reliability of their sources.