When Amy Webb broke her ankle, she was forced to hobble around on a walking boot. That inconvenience spawned others: among them, she couldn’t pass through the metal detector at airport TSA PreCheck lines any longer. Instead, she had to use the backscatter machines that produce X-ray images of passengers.
Webb, who is a professor at New York University and the author of The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity, took the inconvenience as a firsthand opportunity to watch how this technology, which uses computational methods to mark possible risks on the body, really works. “I’m looking at the screen,” she says of the image that appeared from her scan, “and my cast, head, and breasts were big blocks of yellow.” While waiting for the ensuing pat-down, she watched a couple of other women go through. Same thing: blocks of yellow across their breasts.
It was because of underwire bras, she later learned, which the system sometimes can’t distinguish from potential weapons. She’s had other problems with the machines, too, including that her mop of thick, curly hair sometimes confuses them. (My colleague Hannah Giorgis, who also has a lot of curly hair, confirms that she, too, suffers a cranial pat-down every time she goes to the airport.)
Webb’s experience is among the more innocuous consequences of computer systems that don’t anticipate all the types of people who might use them. Courts, for example, have begun using algorithmic risk-assessment tools to inform prison sentences.
Webb says her airport experience can be traced back to the fact that “someone like me wasn’t in the room” when the system was designed, or when it was trained on images of human forms, or when it was tested before rollout. That idea echoes a popular suggestion to remedy computers’ ignorance of different sorts of people: Increase the diversity of representation among the people who make these systems, and they will serve the population better.
But that’s an aspirational hope. Tech-industry diversity is improving, but it’s still pretty terrible. Representation of women, black, and Latinx workers is particularly poor. That makes diversity a necessary but insufficient step toward social equity in computing systems.
For years, companies and educators in the tech sector have framed diversity as a “pipeline” problem. The people with the right educational background get access to the right training, which gets them into the right college, which connects them to the top employers. Fixing the flow of talent into this system, the thinking goes, will produce the workforce that Webb and others are calling for.
Among them is the Constellations Center for Equity in Computing at Georgia Tech, where I hold faculty positions in the colleges of computing and liberal arts. The center’s goal is to increase access to computer-science education among women and people of color. Among its activities, it has funded and supported computer-science classes in Atlanta public schools, especially in primarily black neighborhoods without prior access to those opportunities.
Those efforts have merit. But their impact might be a drop in the bucket, given the size and composition of the tech industry. At Google, for example, more than 95 percent of technical workers are white or Asian. Adding more black engineers from Atlanta schools to that mix will certainly help push the numbers up incrementally. It will also give more people of color access to the economic opportunities the tech industry offers. But there’s also a risk of tokenization: inviting a black man or a curly-haired woman into the room could improve the design of the systems that produced Webb’s experience at airport security, but it probably won’t substantially change the thrust of the tech industry as it currently operates.
Even though she’d like to see more diversity among tech workers, Webb blames educational efforts like those that Constellations is pursuing for the current state of affairs, at least in part. “We’ve had this obsession with STEM education,” she said yesterday during a panel at the Aspen Ideas Festival, co-hosted by the Aspen Institute and The Atlantic. “It’s reached fever pitch, manifested in these programs where every kid has to learn to code.” Webb worries that the drive to make more computer professionals for diversity’s sake might create more problems than it solves. If everyone is focused on the nuts and bolts of making software quickly at scale, where will they learn to design it with equity and care? “Critical thinking is what the computers won’t be able to do,” she said.
Webb points to China as an alternative. There, kindergarten-age students nationwide will begin studying a textbook this year that’s designed to teach the new basics of knowledge they need to succeed in a computational future. “That’s the foundations of the kind of thinking that will allow them to work in conjunction with AI systems,” she concluded. “While everyone’s fixated on ‘Every kid must code,’ we are risking forgetting that every kid must learn to read and write.”
Charles Isbell, the executive director of Constellations and the incoming dean of computing at Georgia Tech, still sees computing education as a necessary step. “The real question,” he told me, “is: Are we interested in diversity, or are we interested in integration?” The integration of women, people of color, and other underrepresented voices would mean that the behavior of the entire industry would change as a result of their presence in that community. “Diversity is just membership,” Isbell said. “Integration is influence, power, and partnership.”
But integration is much harder than diversity. Isbell thinks that two separate conditions need to be met in order to accomplish it: “One is that the new folks are both capable and confident. The other is that the old folks are willing.”
Kamau Bobb, the global lead for diversity research and strategy at Google and a senior director at Constellations, isn’t so sure the tech industry is willing yet. A lot of people are involved in diversity, equity, and inclusion programs in Silicon Valley, and “those people are really committed,” Bobb told me. But their motivation is largely driven by providing access to the existing state of affairs. “They’re compelled by the argument that it just isn’t fair that more people don’t have access to the Google life—the free food and the power and the money,” Bobb said. Their goal is to get more people in the game, not necessarily to change the rules of that game. In this line of thinking, inclusion is first a problem of economic equity; any resulting social or moral benefits would just be gravy.
But for technical systems to take everyone into account, Isbell contends that representation must shift from an economic imperative to a moral one. “First you make the economic argument, and that’s where the industry is now, mostly,” he said. “Then you make the moral argument. That’s where you want to be. Until you win the second argument, you haven’t won.”
In Webb’s view, that argument is unlikely to ever gain traction among big, wealthy tech companies. “A moral imperative is unlikely to motivate public companies,” she told me. Bobb agrees—Google’s focus on the “next billion users” entails a better understanding of people of color, he said, but only because the company finally understands that they represent an untapped market for advertising.
But to Webb, that doesn’t mean those companies are hopeless. The problem, she said, is that scale, market share, and speed matter more than anything else. She believes the problems that arise in computational social infrastructure, such as backscatter X-ray devices and facial-recognition systems, are caused by the ferocious competition between these companies. Webb thinks a better approach to solving the social ills in artificial-intelligence systems would come from some kind of federal office or consortium that might encourage collaboration between tech firms; one such project could be revising data sets that don’t fully represent the general public.
For Webb, the underrepresentation of women, black people, and others is a real problem, but it’s not the fundamental one. “We’re all discriminated against by computing,” she insisted. Computing professionals constitute a “tribe,” separated from the general public not primarily by virtue of their race, gender, or nationality, but by the exclusive culture of computing education and industry. That culture subordinates all other knowledge and interests to the pursuit of technological solutions at maximum speed. “Anyone who falls outside of that core group of interests are not being represented,” Webb said. If she’s right, then the problem with computing isn’t just that it doesn’t represent a diverse public’s needs. Instead, the problem with computing is computing.