How Facebook’s Ad Tool Fails to Protect Civil Rights

The company’s platform lets advertisers exclude people of certain races from seeing their content. That’s a serious problem when it comes to promotions such as housing, credit, and jobs.

Several human minifigures stand in front of the Facebook logo (Dado Ruvic / Reuters)

With more than 1.5 billion users, Facebook has become one of the most powerful, highly visible platforms on the internet. It’s no wonder then that so many advertisers clamor for space on the social-media site. But a recent investigation from ProPublica found that Facebook may be allowing those advertisers to discriminate based on race.

Facebook’s ability to let advertisers target a specific audience—for instance, women between the ages of 25 and 34 with young children—is its primary strength. More and more advertisers count on being able to identify, and market to, very specific groups. But Facebook’s advertising system not only allows marketers to choose who they most want to see their ads—it also allows them to choose entire groups who will never see their ads.

When placing an ad on Facebook, advertisers can explicitly exclude lots of groups, including people with any given educational level, financial status, political affiliation, and—perhaps most disturbingly—“ethnic affinity.”

Facebook’s ad-targeting interface (The Atlantic)

“Targeting ads for housing, credit, or employment based upon race, gender, or sexual orientation violates the federal civil-rights laws that cover those fields—the Fair Housing Act, the Equal Credit Opportunity Act, and Title VII,” says Rachel Goodman, a lawyer at the American Civil Liberties Union. “If Facebook is going to allow advertisers to target ads toward or away from users based on these sensitive characteristics, it must at the very least prohibit targeting in these three areas central to economic prosperity.”  (A spokesperson from Facebook noted that the ad placed by ProPublica was not promoting a rental property, but promoting an event about renters’ rights.)

Steve Satterfield, Facebook’s privacy and public-policy manager, told ProPublica that any use of its advertising platform to intentionally discriminate is a violation of the site’s policy, saying, “We take a strong stand against advertisers misusing our platform: Our policies prohibit using our targeting options to discriminate, and they require compliance with the law.” Satterfield also said that “ethnic affinity” is determined based on what sort of content on the site a user engages with most. That’s not the same as identifying a user’s race, but Facebook does place this category under an ad-targeting category called “demographics.” How accurately does “ethnic affinity” map onto users’ race? It’s hard to say for sure, but a recent report from the Pew Research Center suggests it is not so difficult to determine a user’s race based on their use of the site.

The creation of an ad platform that allows marketers to carve out entire groups of people based on race or ethnicity isn’t unique. “There are a number of sites that allow you to specify based on race,” says Aaron Rieke, one of the heads of the tech-policy consulting firm Upturn. “But Facebook is special because it’s extraordinarily powerful.” The platform is so large, Rieke says, that it probably can’t review every single ad entered into its system. Still, the current system doesn’t have enough safeguards. Relying on users to report discrimination doesn’t work, he says, when the discrimination consists of preventing someone from ever seeing an ad in the first place. Goodman agrees: “When this kind of targeting happens online, it’s nearly impossible for people to know they’ve been denied information about opportunities they might be interested in.”

A better ad-buying platform might automatically flag for review any ad that combines ethnic targeting with a category covered by key U.S. civil-rights legislation, such as housing, credit, or employment. That type of due diligence already exists in the industry, Rieke says.

Granting advertisers the ability to ensure that minorities aren’t able to view certain ads is disturbing and potentially illegal. When ProPublica’s reporters described their experience placing a sample ad that would exclude blacks, they did so for a very specific reason: There is a law—the Fair Housing Act—expressly forbidding racial discrimination in housing. Among its many provisions, the act states that the following is illegal:

To make, print, or publish, or cause to be made, printed, or published any notice, statement, or advertisement, with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin, or an intention to make any such preference, limitation, or discrimination.

This law exists for a reason, and it’s not just some quaint notion of fairness. Housing discrimination in this country, particularly the long history of redlining, has led to a vast and possibly permanent wealth divide between white and black Americans: The white-to-black wealth ratio now stands at 13-to-one. Given that housing is the most valuable asset held by most Americans, exclusion from safe, affluent, white neighborhoods results not only in racial segregation, but also in black Americans remaining mired in concentrated poverty in neighborhoods with fewer services, lower tax bases, worse schools, and homes that didn’t grow in value.

A sign placed across from the Sojourner Truth housing project in Detroit, Michigan, during World War II (Getty)

Facebook’s claim that the intent of its audience-targeting tool wasn’t to allow for discrimination may well be true. But as a site that supposedly prides itself on inclusion, the company has a responsibility to ensure that advertisers cannot perpetuate discrimination right under its nose.