You encounter so many people every day, online and off, that it is almost impossible to be alone. Now, thanks to computers, those people might not even be real. Pay a visit to the website This Person Does Not Exist: Every refresh of the page produces a new photograph of a human being—men, women, and children of every age and ethnic background, one after the other, on and on forever. But these aren’t photographs, it turns out, though they increasingly look like them. They are images created by a generative adversarial network, a type of machine-learning system that fashions new examples modeled after a set of specimens on which the system is trained. Piles of pictures of people in, images of humans who do not exist out.
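For readers curious about the mechanics behind that sentence, the adversarial idea can be sketched in a few dozen lines. This is a toy, not the network behind the website: instead of faces, the "training set" here is just numbers drawn from a bell curve (all the parameter names and values below are illustrative assumptions), but the loop is the same in miniature—a generator learns to fake samples, a discriminator learns to catch them, and each improves against the other.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the training data: real samples from N(4, 1.25).
REAL_MEAN, REAL_STD = 4.0, 1.25

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator g(z) = a*z + b turns random noise z into a sample.
# Discriminator D(x) = sigmoid(w*x + c) scores "how real" x looks.
a, b = 1.0, 0.0   # generator parameters
w, c = 0.1, 0.0   # discriminator parameters
lr, batch = 0.05, 64

def generate(n):
    z = rng.normal(size=n)
    return a * z + b, z

for step in range(2000):
    # Discriminator step: push D(real) toward 1 and D(fake) toward 0
    # (gradient descent on the binary cross-entropy loss).
    real = rng.normal(REAL_MEAN, REAL_STD, size=batch)
    fake, _ = generate(batch)
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    gw = np.mean(-(1 - d_real) * real + d_fake * fake)
    gc = np.mean(-(1 - d_real) + d_fake)
    w -= lr * gw
    c -= lr * gc

    # Generator step: push D(fake) toward 1 (non-saturating GAN loss),
    # i.e. nudge fakes toward whatever the discriminator calls real.
    fake, z = generate(batch)
    d_fake = sigmoid(w * fake + c)
    dg = -(1 - d_fake) * w       # gradient of the loss w.r.t. each fake sample
    a -= lr * np.mean(dg * z)
    b -= lr * np.mean(dg)

samples, _ = generate(1000)
print(samples.mean())  # the generated mean drifts toward REAL_MEAN
```

After training, the generator's output clusters around the real distribution even though it never sees a real sample directly—it learns only from the discriminator's judgments, which is the adversarial trick that, scaled up to deep convolutional networks and millions of face photos, produces the portraits on the site.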
It’s startling, at first. The images are detailed and entirely convincing: an icy-eyed toddler who might laugh or weep at any moment; a young woman concerned that her pores might show; that guy from your office. The site has fueled ongoing fears about how artificial intelligence might dupe, confuse, and generally wreak havoc on commerce, communication, and citizenship.
But are these people who don’t exist any different, really, from all the Tinder profiles on which you swiped left, or the faces in the crowd on the subway whom you might never see again? Modernity—the historical period roughly but not exactly contemporaneous with the rise of industrial societies—invented anonymity and erasure, mustering sorties of human faces at one another every day. Contemporary individuals have trained all their lives to treat people in exactly this instrumental way—not only the strangers on city streets, but also the models in the photos that grace IT-solutions banners inside airport terminals, the youth of all skin shades draped across college quads on application mailers, the baristas who hand over one-Splenda soy lattes with names misspelled on the cups.