Many privacy regulations, including the European Union’s General Data Protection Regulation, include a right to request that your data be deleted. But that right doesn’t apply when your data are used to train AI and machine-learning systems, or when your face is added to a facial-recognition data set without your knowledge.
Just last month, Microsoft took down a publicly available data set of 10 million face photos, mostly collected without user consent or notice. Duke University and Stanford University also took down large data sets of face and body images that were mostly taken without notice or consent. The people whose images were included in these data sets probably didn’t know that their faces were being used to train military systems in China.
Facial recognition is only the tip of the iceberg. License-plate readers, shopping beacons, and a whole suite of mobile trackers follow individuals both online and offline. Amazon Alexa and Google Home devices listen to everything you say. Roombas map your floor plans. Strava and Fitbit record your location. 23andMe and Ancestry.com collect and essentially own your very sensitive genetic information. Our current laws cannot handle the sheer amount of data collected on individuals in what the Harvard professor Shoshana Zuboff calls the “age of surveillance capitalism.”
We need better privacy laws that address the new harms and risks arising from facial recognition, AI and machine learning, and other technological advances. To deal with privacy risks in the larger data ecosystem, we need to regulate how data brokers can use the personal information they obtain. We need safeguards against the practical harms that invasions of privacy can cause; that could mean, for example, limiting the use of facial-recognition algorithms for predictive policing. We also need laws that give individuals power over data they have not voluntarily submitted.
We can’t have effective laws until we expand our understanding of privacy to reflect the data-hungry world we now live in. The FaceApp privacy controversy is not overblown, but some of the criticism is misdirected. The problem isn’t photo-editing apps or third-party developers or Russian tech companies. What we are facing as a society is a systemic failure to protect privacy as new technologies force our preconceived notions of privacy to collapse.