Shadows of a panel appointed by Google to discuss Europeans' "right to be forgotten" in 2014 (Andrea Comas / Reuters)

The Internet never forgets an embarrassing moment, or so it seems in an age when youthful indiscretions can come back to scuttle a job search or cause humiliation long after you’ve cleaned up your act. In 2014, a Canadian drama teacher lost her job at an all-boys school after several erotic films she’d made more than 40 years earlier appeared online. Never mind that the Internet was barely a gleam in the eye of computer scientists at the time.

In the U.S., where free speech is paramount, people who’ve looked to the legal system for help getting embarrassing information about themselves taken offline have mostly gotten a cold shoulder. European courts have been much more sympathetic, striking a balance between personal dignity and freedom of expression, writes Meg Leta Jones, an assistant professor of Communication, Culture, and Technology at Georgetown University, in her new book Ctrl+Z: The Right to Be Forgotten. Jones recently spoke with The Atlantic about this cultural divide and the philosophical and legal quandaries of forgiving and forgetting in the digital age.

Miller: Is there a philosophical tension between the desire to put the past behind us and the desire to document things for posterity—and not edit the past?

Jones: There are two American ideologies here that are running straight into each other in the debate about the right to be forgotten. One is the idea that America is the land of second chances and reinvention, that we’ve built a country where this can happen. The other is this idea that not only do you have a right to express yourself and document things, but you also have a right to know. Knowing what your government does and what your neighbors are doing, that’s part of being a good American citizen.

Miller: You write that the right to be forgotten is more firmly established in Europe. What does that look like?

Jones: There was a case in Spain in 2014 that spurred a lot of controversy. There was an individual who had filed for bankruptcy years earlier. Newspapers publish these notices when you sell your stuff as part of a bankruptcy, but it was coming up 10 years later when you searched for him on Google because they’d digitized their archives. This individual claimed it was old information and irrelevant to who he is today, but the paper refused to remove it from their archives and Google refused to remove the link from their search results. So, he went to the AEPD, the Spanish data protection agency, and they agreed that Google should remove the link. And they did—kicking and screaming.

In Europe, the default is that you better have a good reason to be processing personal information, and if you can’t point to it when someone objects, you’re going to have a hard time. It’s so un-American! The default in America is that of course you can and should share that information.

Miller: So, in Europe, if you don’t like something you see about yourself on the Internet all you have to do is call up your national data protection agency?

Jones: Oh, it’s so much easier than that. As a result of this Spanish case, Google has a form. You verify that you are a European Union citizen and then you literally copy and paste the URLs that you want to be removed from a search result. The numbers are really quite staggering. So far, over 400,000 individuals have made requests to remove more than 1.4 million URLs. The percentage of requests they’ve granted has hovered in the mid-40s. It’s a massive amount of information.

Miller: Have there been similar cases in the U.S. that were decided differently?

Jones: The Second Circuit [of the U.S. Court of Appeals] decided last year on a case that would have come down very differently in most European countries. A woman in Connecticut was arrested but never convicted. But when you did a Google search on her name, the arrest still came up because a local newspaper had covered it. She was having a hard time finding a job. She sued for defamation. She said it was false information that the newspaper was providing. The Second Circuit said no, you were arrested. There’s no false information here for us to act on. We can’t help you.

The right to be forgotten in Europe was born out of the idea that people who served their time need to be integrated back into society and treated like human beings, and that meant not referring to them in relation to their criminal pasts.

Miller: You’d like to see us become more like Europe in this regard?

Jones: No, not necessarily. I do think the American political system should consider the right to be forgotten. There are a number of surveys that asked people if they’re concerned about this, and the numbers are roughly the same as they are in Europe.

Miller: Are there ways to do it that wouldn’t run afoul of the First Amendment?

Jones: We already have a handful of laws that limit the use of information in different contexts or for different categories of individuals. One that everyone knows is the Fair Credit Reporting Act. Credit agencies aren’t supposed to use information that is older than seven years. That’s for two main reasons. One is that it’s not considered good information, and the second is that we want to let those things age out of people’s records because it encourages good financial behavior. Another example is juvenile criminal records. In lots of states, depending on the criminal activity, a child can petition the court to have their record edited, so that they go into adulthood with more opportunities than if they were labeled a criminal. In both examples the First Amendment has allowed for that because of the overwhelming social value.

Miller: In the book you write that information has a lifecycle, that its value changes with time. How is that relevant to this discussion?

Jones: As we start crafting regulations around what information should be available, we really need to have a way of talking about the value and harm to different parties over time. The point of describing an information lifecycle is so we can talk about whether we’d want a law that forces people to cull certain information as it ages, or apply robots.txt files [so search engines can’t find it], or anonymize it, or time stamp it so it doesn’t look new.

The whole conversation about the right to be forgotten has been really binary—as if digital information is permanent and the only options are to keep it or erase it. But digital information isn’t necessarily permanent, and we have tons of other options.