A data breach at the credit-reporting firm Equifax disclosed last month—a hack that affected an estimated 145.5 million Americans—cost the company’s CEO, Richard Smith, his job. And, because of many American employers’ hiring practices, the hack could cost many others jobs as well.

The consequences of the hack will probably be felt by its victims for years: With this trove of sensitive personal information, the hackers have an unprecedented opportunity to commit identity theft, signing up for credit cards and loans under other people’s names. Any such fraudulent accounts will appear on victims’ credit reports, and when those accounts fall into delinquency, it will look like victims have failed to pay off their debts.

This will likely make everyday financial activities—say, applying for a new credit card—more time-consuming and difficult for countless millions. It’s also going to affect something perhaps even more consequential: people’s job searches. That’s because, in the estimation of the Society for Human Resource Management, about half of American employers at least sometimes consider credit history when deciding whom to hire (though the available data don’t allow for a precise figure). Some companies run credit checks only when an employee would handle money or sensitive information, or serve as a senior manager. But other firms run credit checks for every position, and while they do have to ask applicants’ permission before running one, they can also refuse to hire anyone who says no. The practice had numerous shortcomings before the hack, and those downsides will now likely be exacerbated.

Credit bureaus and background-check companies, including Equifax, tend to argue that applicants’ credit history sheds light on their trustworthiness and character. Employers’ embrace of credit reports is understandable: Predicting how a candidate will behave if hired is a tricky task, and who wouldn’t want more information?

But opponents of the practice argue that bad credit just as often reflects bad luck, such as having to deal with a medical emergency or job loss, and that by denying jobs to people with bad credit, employers exacerbate economic misfortune. On top of that, even before the Equifax breach, there was a high rate of mistakes in credit reports—one in four contains errors, according to the Federal Trade Commission. So credit reports are, at best, faulty barometers of an applicant’s dependability.

Over the past few years, I’ve interviewed employers about how they use credit reports in hiring, and there are deeper reasons for concern. My research shows that employers lack hard evidence about how credit data relate to workplace behavior, and so to draw conclusions from delinquent debt, they often try to figure out what has happened in a job candidate’s life and whether those events justify bad credit. This means that to get a job, candidates may have to discuss their medical history, marital status, or other topics generally considered taboo in hiring. As one might imagine, job candidates are at times fairly unhappy about needing to disclose such personal details as part of a job interview.

In a way, though, that might be exactly what opponents of credit checks want: employers taking the time to parse candidates’ credit history to determine whether it says something about them or about their situation, and then hiring people who aren’t deemed at fault. The problem is that what one employer considers a legitimate reason for bad credit isn’t necessarily what the next one does. In my research, I’ve found that some think the recently divorced should be judged less harshly, but others don’t. Some think a bankruptcy demonstrates an abdication of personal responsibility, while others think it indicates sagely following a lawyer’s advice. Employers also make assumptions about people’s spending based on the reports: Some think defaulting on a Best Buy credit card marks the height of irresponsible spending, while others feel that way about defaulting on a Victoria’s Secret card.

More disturbingly, my research suggests that the moral distinctions employers draw vary according to their own life experiences—carrying student loans, say, makes one more empathetic to candidates struggling to pay off their education debt—as well as according to their class, gender, and perhaps even race. Whether a candidate with bad credit gets a job depends not only on credit-history details and how the candidate explains them, but also on the assumptions made by the people in charge of hiring. What appears on the surface to be a system based on objective data is really one based on human biases.

The problems with relying on credit reports extend to the way inaccuracies in them are often addressed. Employers generally put the burden on candidates to prove that delinquencies are mistakes, and at times even require the credit report itself to be corrected, a task that can be Kafkaesque in the case of identity theft and one that more people will likely face in the wake of the Equifax hack. In my research, employers generally seemed unconcerned that fixing a credit report can easily take weeks or months, far longer than most positions remain open.

That employers sometimes request corrections reflects a seldom-discussed aspect of the practice: Credit checks carry important symbolic value to those keeping tabs on the people in charge of a company’s hiring. In my interviews, a number of hiring professionals talked about pulling credit reports in order to demonstrate to government regulators, investors, business partners, and customers that they were following sound hiring practices. In these cases, whether a credit report said anything useful about a job candidate was less important than the perception that workers had been appropriately vetted.

Furthermore, some job seekers self-select out of jobs that require a credit check. Companies often say up-front that they run credit checks, but not that candidates with bad credit will get a chance to explain the circumstances. To some people, applying therefore feels like a waste of time—even though their story might have passed muster.

Is anything being done to curtail the use of credit checks in hiring, especially now that the Equifax hack is poised to make the practice’s downsides worse? A week after the breach was made public, Congressman Steve Cohen and Senator Elizabeth Warren introduced bills to curb the use of credit checks in hiring, an effort that had already been gaining traction at the state and local level: Since 2007, more than a dozen states and cities have passed laws restricting employer credit checks. Action at the federal level has been slower; bills similar to Cohen’s and Warren’s latest have been introduced in Congress many times without becoming law. That said, outrage over the Equifax hack might improve the prospects of such legislation passing.

One unknown, though, is what sort of screening devices companies might turn to if deprived of credit reports. In 1988, Congress banned most employers from using polygraph tests, the onetime go-to for establishing which candidates could be trusted. But after that ban went into effect, credit bureaus started marketing credit checks as an alternative, prompting companies to adopt them en masse.

Policymakers, then, are left with a tough set of choices. First, they could leave in place a biased system that exacerbates economic disadvantage. Second, they could ban or deeply curtail the use of employment credit checks, potentially opening the door to other unpalatable hiring practices that haven’t yet been deeply explored. Or third, they could get rid of credit checks and work with employers and regulators to devise a system that doesn’t place the full burden of risk management on individual job candidates.

What would such a system look like? The use of credit checks assumes that malfeasance can be avoided by weeding out inherently untrustworthy individuals, but employers that don’t use credit checks often work on creating environments where people aren’t as tempted to misbehave, such as by having multiple employees sign off on large purchases and by conducting intensive ethics training.

That last option definitely seems like the hardest of the three, but it is also perhaps the one that best avoids unintended consequences and underscores that employers should focus on who can do the job, and nothing more.