The Atlantic

As general counsel of the National Security Agency in the 1990s, Stewart Baker advocated for limiting the government’s intelligence-gathering powers in the name of civil liberties. Then the 9/11 attacks happened, and Baker concluded that the limits he’d supported contributed to the security lapse. So he began pushing the case for surveillance—first while serving as a member of the George W. Bush administration, then by writing op-eds, hosting podcasts, and sparring with opponents who believe that his proposals endanger fundamental rights. As the country struggles to contain the coronavirus, he thinks that many Americans will experience a conversion like his, becoming more willing to make sacrifices to their privacy. He captured the mood of the pro-surveillance camp in the title of a recent podcast episode: “Is Privacy in Pandemics Like Atheism in Foxholes?”

Baker and other surveillance proponents say that controversial measures could save lives. Baker particularly supports the idea that the government should be able to access contact data recorded by your smartphone’s Bluetooth signal and, when necessary, location data recorded by its GPS. Tech companies, meanwhile, have reportedly proposed using thermal-imaging sensors and deploying security cameras with facial-recognition software to track infected people and their contacts. These ideas are meeting resistance from privacy advocates and reigniting a debate about surveillance and civil liberties that gained momentum after the September 11 attacks. Baker, who was also an early member of the Department of Homeland Security, told me he believes the attacks showed that an overcommitment to privacy left the country exposed; afterward, the balance tilted heavily toward surveillance, leading to practices such as warrantless wiretapping and the bulk collection of phone records. But his opponents have a different view of what happened after 9/11: The government leveraged public fears to expand its power, many new surveillance measures did little to make the country safer, and the rights Americans gave up have been hard to win back.

To get a sense of this renewed debate, I spoke with Baker and others who have spent years trying to resolve the trade-offs between the nation’s values and security in the context of counterterrorism, and who are now considering the same dilemma in light of the pandemic. All agreed that now is the time for Americans to have the difficult discussion about what changes they want to see and how far they want them to go. Terrorism and pandemics are two very different problems, to be sure, but there are lessons to be learned from how the country responded the last time it faced a defining crisis.

There are already signs that the government is considering expanding its reach in response to the pandemic. The Justice Department has petitioned Congress for new emergency powers, including the ability to ask judges to detain people indefinitely without trial. Donald Trump has clamped down at the southern border and restricted asylum claims in the name of public health; on Monday, he announced that he will use an executive order to temporarily suspend immigration to the U.S. He publicly considered mandating a quarantine for the New York region, though he later walked back that idea. He then declared that he has “total authority” over states in lifting social-distancing restrictions, before backtracking again. Digital surveillance, meanwhile, has quickly become a major focus for the government and tech companies alike; as my colleague Derek Thompson has explained, proponents of location-tracking and contact-tracing technology in particular believe that it could help end the pandemic more quickly. But even if that’s the case—and some cybersecurity experts contend that the utility of this sort of tracking during pandemics is far from clear—such technology still raises significant concerns.

These concerns are expressed by people such as Douglas London, who worked at the CIA for three decades before retiring last year, and whose career dealing with counterterrorism issues has led him to regard government surveillance with more skepticism than advocates such as Baker. He told me that he can envision a scenario in which the government proposes “a PATRIOT Act for pandemic monitoring and control”—a reference to the law enacted after 9/11 that gave the government more powers to fight terrorism while also laying the groundwork for sprawling new surveillance. “The invasion of personal privacy would offer the government strong technical surveillance tools to help in containing and rolling back a pandemic,” said London, who now teaches at Georgetown University. “But I think Americans should resist such measures. Once you give away those rights and privacies, you’re never going to get them back. And once the government has these powers, they can be used for other things, and they can be abused.”

The virus has ushered in an expansion of government surveillance in other countries. China used facial-recognition software and the police state’s ubiquitous security cameras, along with a mandatory tracking app, in its containment efforts. (Russia has reportedly turned to facial recognition as well.) In South Korea, investigators combined location data from smartphones with security footage and records of credit-card transactions in attempts to determine who may have been exposed. As Thompson noted in his article, the information the government compiled included a person’s “last name, sex, age, district of residence, and credit-card history, with a minute-to-minute record of their comings and goings from various local businesses.” In Israel, the government deployed a previously secret counterterrorism program that tracks a person’s location via his or her phone.

Baker, in a recent article, proposed that America follow the lead of Singapore, which asked citizens to download a contact-tracing app developed for the pandemic. That app records a list of smartphones that pass within range of a user’s Bluetooth signal; then, if someone is infected, the government accesses that list and contacts the people on it, telling them that they might have been exposed to the virus. In the United States, such an app would raise objections from privacy advocates because it gives the government access to information about a person’s interactions. The app’s effectiveness would also depend on how many Americans actually used it. (Singaporean officials have said that three-quarters of the population would need to use the app for it to be effective, and they caution that they’re far from achieving widespread use.) Baker argued that, thanks to emergency statutes written after 9/11, most state governors have the power to seize communication devices during a public-health crisis. This could apply, he said, to the software that runs on personal smartphones. Governors could theoretically push, or even require, Google and Apple to automatically download the app onto residents’ phones and send a notification asking them to activate it, Baker wrote. With these emergency powers, governors could even mandate that people use it, he added. Such a move has no legal precedent, Baker told me, but he asked: “Which is a bigger intrusion on liberty: requiring that we all stay home and lose our jobs, or requiring that we add [such] an app to our phones?”

When I asked him about the privacy concerns surrounding his proposal, and the idea of letting the government access location data during the pandemic more generally, he said that ship has sailed, because technology and advertising companies already collect so much location data in order to monetize it. (Many claim that these data are anonymized, but journalists have shown that to be wishful thinking.) “Nowhere you go, at any time, is your location any longer private. It is in the hands of multiple advertisers and people who are going to use that for the purposes of trying to sell you stuff,” Baker said. People can argue that private companies having this information is different from the government having it. But, he countered, that conversation is “now deep into a kind of bargaining over location privacy as opposed to saying that your location is sacrosanct.”

Other veterans of the surveillance debate have surprised themselves with some of the changes they’re proposing. Alan Rozenshtein, a law professor who worked in the Justice Department’s national-security division during the Obama administration, told me that he thinks new government location-surveillance powers are warranted so long as authorities can show they’re effective. So too, he said, is the government’s ability to use and possibly even require thermal-imaging cameras in public places, as well as an expansion of its use of facial-recognition software. “As someone who has been studying 9/11 and the U.S. response to it, I see this as the same kind of inflection point,” Rozenshtein said. “I just don’t see us going back to normal ever again after this.”

But he’s uneasy with where this logic takes him. “I myself am made somewhat uncomfortable by my own arguments,” he said, because of the potential for government officials and private companies to take advantage of the emergency, and for surveillance measures enacted in the name of fighting pandemics to be put to other uses. Since 9/11, for example, information acquired via surveillance on national-security grounds has been used to prosecute drug crimes, food-stamp and mortgage fraud, and lying on bank statements. Conversations recorded by an Amazon Echo and heart-rate data tracked by a Fitbit have been used in criminal investigations. “There really is such a thing as surveillance creep, and surveillance programs do tend to increase beyond their initial scope,” Rozenshtein said. “Pandemics, like other emergencies, have often been these catalyst moments for the permanent expansion of the government. And the government does not tend to shrink after the moment has passed.”

A key distinction between counterterrorism surveillance and measures designed to counter pandemics is the targets. The former is directed primarily at foreign nationals and organizations, whereas the latter would be trained specifically on Americans. That element of domestic surveillance is what most concerns Klon Kitchen, a senior fellow at the Heritage Foundation who spent more than 15 years as a U.S. intelligence officer focused on counterterrorism. He told me that he’s open to the idea of expanded surveillance to fight the pandemic—but only if it’s conditioned on rigorous constraints and oversight. When I asked whether he thinks the U.S. government is capable of imposing these limits, he paused and sighed. “Look, it’s the nature of those who surveil to always want more,” he said. “And when we ask the government to surveil, we give it a mandate and we stoke that appetite.”

Since early March, new proposals for expanded digital surveillance have been “coming fast and furious,” Cindy Cohn, the executive director of the Electronic Frontier Foundation, a digital-privacy group, told me.

A report in The Wall Street Journal showed that data-mining firms already have contracts with the Centers for Disease Control and Prevention and the National Institutes of Health; Facebook and Google are in discussions with the U.S. government about using location data to help track the outbreak; a location-tracking start-up founded by former government officials has been in talks with the White House; and a facial-recognition company is exploring work with state agencies. Other companies have been marketing thermal-imaging cameras to the government that, while of dubious utility, raise further privacy issues. Big Tech companies were already fighting to preserve their ability to collect and monetize massive amounts of consumer data. Law-enforcement agencies were already testing controversial facial-recognition software. Cohn worries that the crisis could make these practices seem more palatable to government officials, lawmakers, and the public if they’re rebranded as measures to ensure public health.

The current climate is similar to the one that followed the 9/11 attacks, she said: “It gives a lot of opportunity for people to profiteer, for people to take advantage of the crisis to do things that are not necessarily in the public interest, and for people to just make mistakes.”

Many Americans may feel inclined to shrug off the privacy concerns surrounding these tools right now if they help alleviate the crisis, but Cohn counters that some of the most invasive surveillance measures the government enacted after 9/11 were not all that effective at stopping terrorism. The NSA program that collected Americans’ phone records in bulk was deemed by a federal oversight board to be of “minimal” help in counterterrorism. Three controversial domestic spying programs once billed as crucial are in limbo after Congress allowed them to lapse last month. In that vein, Cohn and others argue that the use of location data in pandemics might not live up to proponents’ promises. GPS data are not accurate enough to tell whether someone has been within, say, six feet of an infected person. Bluetooth signals can cut through car doors and walls. Time and resources might be wasted on false leads.

Years of progressive encroachments on privacy have numbed many Americans, and some even consider them fair trade-offs for the convenience and security of modern life. But the erosion of privacy weakens a democracy, advocates such as Cohn argue, leaving people feeling as if they’ve lost control not just over their government, but over their personal lives—and the ability to think, act, and communicate without the expectation that someone is watching or listening is fundamental to a thriving democracy.

A tragic irony in Trump’s response to the pandemic is that his failure to take actions early on forced the government to adopt measures such as stay-at-home orders, the closing of businesses, and bans on social gatherings and religious services, Stephen Vladeck, a law professor at the University of Texas who focuses on national security, told me. These failures raise a question, he said, “that I think is going to haunt us: Would those measures have been necessary no matter what, or are they necessary because we didn’t use [less extreme] powers when we should have?”

Trump could have mandated that private companies produce extra personal protective equipment and ventilators weeks or even months earlier, Vladeck noted, and released millions of dollars in emergency federal funds. He failed to take advantage of the CDC’s broad authorities to quarantine and isolate travelers even as people from coronavirus hot spots around the world continued to fly into the United States. And, of course, the Trump administration failed in the most basic and crucial element of providing early and widespread testing, even after the secretary for health and human services declared a public-health emergency in late January.

After 9/11, a similar lack of leadership from the White House left decision makers across the government scrambling. This paved the way for much of the abuse and overreach that followed. The most glaring missteps are well known: the Iraq War, torture, Guantánamo Bay.

In the background, U.S. troops and policy makers were driven by a sense of urgency to do something—anything—to protect people from further harm. Stanley McChrystal, the retired general who made his name hunting al-Qaeda operatives after 9/11, recalled the feeling. “You have this emotional, do-something response, as we saw with 9/11. What do I do about it? Who do I kill? Who do I protect? There’s a frustration that comes with not being able to get your hands right on it,” McChrystal told me. “There’s always this slide toward the expedient. Yet at a certain point on the slide, you can’t go back up the slope.”

McChrystal sees similar dilemmas now and counts himself among those who find proposals for new surveillance powers compelling. “If someone is a threat to society—for example, if they’re infected, whether they don’t know it or they do know it and are choosing not to modify their behavior—then I think society knowing where they are and being able to do something about it is a pretty reasonable proposition,” he said. “My heart is telling me that you’re talking about, potentially, the protection of millions of people.”

Weighing the trade-offs, he added, “requires exceptionally mature leadership.”

But it doesn’t just fall to the leadership to find the right balance.

As an FBI agent tracking al-Qaeda after the 9/11 attacks, Ali Soufan became known for showing that sticking to U.S. values did not require sacrificing the nation’s safety. Shunning the abusive practices used by other interrogators, which often elicited bad information, he gained crucial intelligence by getting to know prisoners, famously making one major breakthrough by engaging a detainee in an interrogation that included a theological debate over a meal and tea. And he told me that this is a time for Americans to double down on their values and use the crisis to become more unified, to reestablish global leadership, and to recommit to national service.

The risk is getting distracted—much as America did after 9/11—from addressing the real issues that led to the problem in the first place. “What do you think doctors and nurses need today? Do they need facial recognition, or do they need masks?” Soufan asked. “We need more hospital beds, not more smart cameras surveilling people. We need more scientists. We need an international system that can deal with this kind of problem.”

So can America find a way to strike that balance between its values and its safety?

Joshua Geltzer, who was a senior director for counterterrorism in the Obama White House, believes it can. We should follow the lead of scientists to determine what, if any, new surveillance techniques would be truly useful, he told me. Before enacting them, the government and Congress should set up a rigorous transparency and oversight regime. And they should make any new surveillance powers temporary. “It’s a question of calibration,” said Geltzer, who now teaches law at Georgetown.

Meanwhile, new proposals offer ways to monitor the pandemic without involving the government. The game developer Nicky Case recently published a summary of location-monitoring technology that “can foil both COVID-19 and Big Brother.” Using your phone’s Bluetooth signal, an app would broadcast a series of unique, randomly generated codes while recording those of other nearby users. If you test positive for the virus, you tell the app, which automatically alerts anyone whose phone recorded your randomized codes. Because these codes contain no identifying information, your privacy remains protected. Apple and Google announced this month that they’re developing new tools that could make such an app possible, saying the technology would focus on the sort of anonymized codes that Case laid out.
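The scheme Case describes can be sketched in a few lines of code. This is a simplified illustration, not any real protocol: the `Phone` class, the `encounter` helper, and the one-code-per-meeting exchange are invented for the example (deployed designs, including the one Apple and Google announced, rotate codes on a schedule and derive them cryptographically from on-device keys).

```python
import secrets

class Phone:
    """Simplified model of one phone in an anonymous contact-tracing scheme."""
    def __init__(self):
        self.broadcast_codes = []  # random codes this phone has sent out
        self.heard_codes = set()   # codes received from nearby phones

    def new_code(self):
        # Each broadcast uses a fresh random code with no identifying content.
        code = secrets.token_hex(16)
        self.broadcast_codes.append(code)
        return code

def encounter(a, b):
    # When two phones come within Bluetooth range, each records the
    # other's current code; no names or locations are ever exchanged.
    a.heard_codes.add(b.new_code())
    b.heard_codes.add(a.new_code())

def exposed(phone, published_codes):
    # After an infected user publishes their own broadcast codes,
    # every phone checks locally whether it ever heard one of them.
    return bool(phone.heard_codes & set(published_codes))

# Alice and Bob meet; Carol meets no one.
alice, bob, carol = Phone(), Phone(), Phone()
encounter(alice, bob)

# Alice tests positive and publishes only her random codes.
published = alice.broadcast_codes
print(exposed(bob, published))    # True: Bob's phone heard one of Alice's codes
print(exposed(carol, published))  # False: Carol's never did
```

The point of the design is visible in `exposed`: the published list reveals nothing about who Alice is or where she went, and the matching happens on each user’s own device.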

But any contact-tracing technology still relies on the sort of widespread adoption required by the Singapore app. Nicholas Christakis, a pioneering sociologist and professor at Yale, has a different kind of tool in mind.

He and a team of researchers and developers are rapidly working to finalize a new app for tracking the pandemic with features that could alleviate privacy concerns. (He told me that he worries about the possible erosion of privacy in response to the pandemic, and that “the union of this modern technology and the surveillance state is very concerning.”) His app, called Hunala, is based on his research in network science, which includes the principle that your friends are generally better connected, with more contacts, than you are. The app, which will be available for download by the end of the month, “is opt-in, and it’s anonymized,” he said. Once people download it, they’ll be asked to nominate some of their friends, who then receive a message asking them to download it too. Users volunteer to report whether they have a fever or any other COVID-19 symptoms. They also note their location when they log this information, though their location is not tracked throughout the day. The team of scientists behind the app can monitor this information, provide early notice as to where new cases are cropping up, and alert public-health officials to be ready; people, in turn, can use the information to avoid areas with infections.
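The principle Christakis invokes, sometimes called the friendship paradox, can be demonstrated with a small simulation. This is a generic illustration, not his model: the network’s size and random wiring are arbitrary choices for the example. It compares the average number of contacts of randomly chosen people with that of their randomly chosen friends.

```python
import random

random.seed(42)

# Build a random network: 1,000 people, linked by 3,000 random friendships.
n = 1000
neighbors = {i: set() for i in range(n)}
for _ in range(3000):
    a, b = random.sample(range(n), 2)
    neighbors[a].add(b)
    neighbors[b].add(a)

degree = {i: len(neighbors[i]) for i in range(n)}

# Average contacts of randomly chosen people (ignoring isolates)...
people = [i for i in range(n) if neighbors[i]]
mean_random = sum(degree[i] for i in people) / len(people)

# ...versus a randomly chosen *friend* of each of those people.
friends = [random.choice(sorted(neighbors[i])) for i in people]
mean_friend = sum(degree[f] for f in friends) / len(friends)

# The friendship paradox: friends are better connected on average,
# because well-connected people show up on many friend lists.
print(mean_friend > mean_random)
```

Because well-connected people sit closer to the center of the network, monitoring nominated friends rather than random volunteers can surface an outbreak earlier, which is why such an app would not need the adoption rates that contact tracing demands.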

Christakis said the app could predict an outbreak two or three weeks before health officials would typically notice it otherwise. (His team used a similar concept to successfully track the H1N1 outbreak at Harvard in 2009.) The network science behind the app means it could achieve its objectives at much lower adoption rates than contact-tracing apps require. It could significantly mitigate the spread of the virus once lockdown orders are lifted or if a second wave comes, Christakis said, “and it doesn’t involve the government knowing where we are at all times.”
