While the packaging for these features is shiny and new, the substance behind them is not. Users could always use Maps while logged out of their Google profile, and could manually delete stored Echo recordings by logging in to their Amazon account. The recently announced features are faster and require less searching through menus, but privacy and ethical-design experts are unimpressed. While useful, they argue, the new policies do very little to move the needle.
Mona Sloane is a professor at the Tandon School of Engineering at New York University, where she researches ethical-design principles in engineering and artificial intelligence. The first problem with privacy features that can only pause, not stop, surveillance, she argues, is that users have to enable these features themselves. “Outsourcing that is a way of circumnavigating and avoiding responsibility,” Sloane says. “It’s a way of maintaining the core business model, which is generating as much data as possible.”
For both products, the default remains: You can pause tracking, but you can’t stop it. Users can’t tell Alexa never to store their Echo commands to begin with, nor can they preemptively tell Google to never track them while they’re logged in to the Maps app.
The new shortcuts seem targeted at a user base that’s fed up after two years of big-tech privacy breaches and surveillance scandals. In April, Amazon Echo users were shocked to learn that their recorded voices were being collected and sent to human contract workers for analysis. In February, many Google Nest owners learned from a tweet that their products had shipped with functional microphones. And Facebook is still revamping its public image since news broke last year of Cambridge Analytica harvesting user data for a national influence campaign.
Amazon’s and Google’s feature announcements, one privacy expert argues, are part of a Silicon Valley campaign to regain trust. “It’s privacy as a promotional tool,” says Albert Fox Cahn, the executive director of the Surveillance Technology Oversight Project at the Urban Justice Center. Cahn says that consumers’ perceptions of risk and reward when they’re shopping for smart products are changing. With each new scandal, wary shoppers become more convinced that it’s not worth wagering their privacy to use tracking software. These incognito-style moves are, in Cahn’s opinion, designed as damage control to get back in buyers’ good graces.
Cahn describes shoppers’ new caution as an “existential threat” to tech companies. “They’re fearful of what lawmakers will do,” he says, “and they’re trying to win back our trust with these measures that create the illusion of privacy, but don’t threaten their core business model.”
Those business models, some scholars argue, often become tech companies’ ethics, shaping how they design their products. Ben Wagner, a professor at the Vienna University of Economics and Business, has found in his research that companies do embrace ethical principles after privacy backlashes. But those principles don’t necessarily reflect the values that consumers hold. Wagner wrote about the risks of this disconnect in 2018, noting that firms regularly engage in “ethics washing” and “ethics shopping,” phrases borrowed from an earlier European Union report on governing AI.