Data gathered from those sources can end up feeding back into police systems, leading to a cycle of surveillance. “It becomes part of these big-data information flows that most people aren’t aware they’re captured in, but that can have really concrete impacts on opportunities,” Gilman says.
Once an arrest crops up on a person’s record, for example, it becomes much more difficult for that person to find a job, secure a loan, or rent a home. And that’s not necessarily because loan officers or hiring managers pass over applicants with arrest records—computer systems that whittle down tall stacks of resumes or loan applications will often weed some out based on run-ins with the police.
When big-data systems make predictions that cut people off from meaningful opportunities like these, they can violate the legal principle of the presumption of innocence, according to Ian Kerr, a professor and researcher of ethics, law, and technology at the University of Ottawa.
Outside the court system, “innocent until proven guilty” is upheld by people’s due-process rights, Kerr says: “A right to be heard, a right to participate in one’s hearing, a right to know what information is collected about me, and a right to challenge that information.” But when opaque data-driven decision-making takes over—what Kerr calls “algorithmic justice”—some of those rights begin to erode.
As part of her teaching, Gilman runs clinics with her students to help people erase harmful arrest records from their files. She told me about one client she worked with, a homeless African-American man who had been arrested 14 times. His arrests, she said, were “typical of someone who doesn’t have a permanent home”—loitering, for example—and none led to convictions. She helped him file the relevant paperwork and got the arrests expunged.
But getting arrests off a person’s record doesn’t always make a difference. When arrests are successfully expunged, they disappear from the relevant state’s publicly searchable records database. But errors and old information can persist in other databases even when officially corrected. If an arrest record has already been shared with a private data broker, for example, the broker probably won’t get notified once the record is changed.
In cases like these, states are nominally following fair-information principles. They’re allowing people to see information gathered about them, and to correct mistakes or update records. But if the data lives on after an update, Kerr says, and “there’s no way of having any input or oversight of its actual subsequent use—it’s almost as though you didn’t do it.”
The pitfalls of big data have caught the eye of the Federal Trade Commission, which hosted a workshop on the topic in September. Participants discussed how big-data analysis can include or exclude certain groups, according to a report based on the workshop. Some commenters warned that algorithms can deny people opportunities “based on the actions of others.” In one example, a credit-card company lowered some customers’ credit limits because other people who had shopped at the same stores had a history of late payments.