A pair of security experts I spoke to about Smith’s project were supportive of his search for another anonymizing technique—but wary of how he implemented it. Bruce Schneier, a fellow at Harvard’s Berkman Center and the author of Schneier on Security, warned against underestimating internet providers’ ability—and drive—to see through data-obfuscation tactics. “The question is, after 100 years of coding theory, how good are those algorithms at finding the signal in the noise?” he asked.
He hypothesized that a system masking a person’s browsing history by layering in copies of other people’s browsing patterns might be more useful. That way, the internet provider isn’t looking for a needle in a haystack, but instead is looking for one particular needle in a pile of other needles. “It would be a Tor-like system where anonymity comes through shared usage,” Schneier offered.
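For illustration only, here is a minimal sketch of that idea, assuming a shared pool of contributed browsing traces stored in a file called shared_traces.txt; the file, the pacing, and the fetching details are hypothetical and are not part of any system Schneier described:

```python
import random
import time
import requests

# Hypothetical illustration of Schneier's suggestion: rather than generating
# random noise, replay page visits drawn from a shared pool of traces
# contributed by other users, so each participant's traffic blends into the
# crowd. The pool file, pacing, and fetch logic here are assumptions.

def load_shared_traces(path="shared_traces.txt"):
    """Read one URL per line from a pool of contributed browsing traces."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

def blend_into_crowd(traces, visits_per_hour=60):
    """Visit URLs sampled from other people's traces at a human-like pace."""
    while True:
        url = random.choice(traces)
        try:
            requests.get(url, timeout=10)
        except requests.RequestException:
            pass  # a dead link is fine; the cover traffic simply continues
        # Jitter the interval so the pattern is not trivially machine-like.
        time.sleep(random.uniform(0.5, 1.5) * 3600 / visits_per_hour)

if __name__ == "__main__":
    blend_into_crowd(load_shared_traces())
```

The point of the sketch is the source of the "noise": because every visit is a page some real person actually browsed, the ISP sees needles among needles rather than a needle in randomly generated hay.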
Kenneth White, a security researcher and co-director of the Open Crypto Audit Project, balked at the script’s use of random browsing. “As written, it is actively dangerous,” he wrote in an email. Smith’s program does use a blacklist to steer clear of problematic websites, avoiding pages categorized as having to do with drugs, gambling, hacking, and porn, among other topics. Still, White worried, random Google searches could send the program down a dark rabbit hole without the user’s knowledge.
“After crawling several hundred links, you may get an unwanted visit from law enforcement—defendable, no doubt, but not without some awkward explanations,” White said. He added: “It’s an interesting idea, but this particular proof of concept is an academic/hobby exercise that adds more problems than it solves.”
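As a rough illustration of the kind of safeguard the article describes, the sketch below checks candidate URLs against a small blacklist and a set of blocked categories. The category names, keyword lists, and helper function are assumptions made for illustration; they are not Smith’s actual code.

```python
from urllib.parse import urlparse

# Hypothetical category keywords and an explicit domain blacklist, standing in
# for whatever lists Smith's script actually uses.
BLOCKED_KEYWORDS = {
    "drugs": ["narcotic", "cannabis"],
    "gambling": ["casino", "poker", "betting"],
    "hacking": ["exploit", "keylogger"],
    "porn": ["xxx", "porn"],
}
BLOCKED_DOMAINS = {"example-badsite.com"}

def is_blocked(url: str) -> bool:
    """Return True if a candidate URL is blacklisted or matches a blocked category."""
    host = urlparse(url).hostname or ""
    if host in BLOCKED_DOMAINS:
        return True
    lowered = url.lower()
    return any(kw in lowered for kws in BLOCKED_KEYWORDS.values() for kw in kws)

if __name__ == "__main__":
    print(is_blocked("https://example.com/casino-games"))  # True: matches a gambling keyword
```

White’s objection is that keyword and category filters like this are only as good as their lists: a random walk through search results can still land on pages no list anticipated.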
Smith acknowledged the critiques, some of which he’s used to improve his script. He emphasized that he created the tool for his own use, and that anyone who wanted to use the script he posted on GitHub should consider the benefits and drawbacks. “I’m eating my own dog food, and I don’t want to run into trouble myself,” he said.
Smith’s technical background lends itself to the project: He specializes in “detecting difficult-to-find signals in noise,” so he’s well equipped to develop a program that would make it harder to do just that. But it has its limitations.
It’s not designed, for instance, to create plausible deniability, the way some obfuscation systems are. It doesn’t cover up sensitive web activity, which would remain easily accessible to an ISP that was looking for it. (That’s what Tor is good for.) And it’s not perfectly frictionless, either: Smith said his wife has noticed she’s often asked to fill in a CAPTCHA—an online test to prove that she’s human—when she visits Google, a measure designed to keep bots like Smith’s from flooding the search engine with automated queries.
Smith and the security experts who reviewed his code all believe that users have to do more to protect their privacy, because internet providers won’t protect it for them. “We’re in an adversarial role with ISPs,” Smith said. Where appropriate, they say, use a VPN and Tor. Soon, more peer-reviewed privacy tools will likely appear that further hide, blur, or drown out your signal.