
On Tuesday, joining the dozens of lawmakers asking Mark Zuckerberg about Facebook’s data practices, Senator Ed Markey of Massachusetts posed a question: “Would you support a child online-privacy bill of rights for kids under 16 to guarantee that that information is not reused for any other purpose without explicit permission from the parents for the kids?”

Zuckerberg said he agreed on the “general principle” but wasn’t sure a new law was necessary. “I think that this is certainly a thing that deserves a lot of discussion,” he said.

The exchange comes amid ever-increasing use of devices and social media among kids, inviting scrutiny of how kids use the internet, and how the internet uses kids. Parents’ concerns about data are encoded in the existing law that Markey alluded to expanding, but they also worry about the content that kids stumble across on big platforms. There’s no shortage of good content for kids on the web, but no one, including parents, has figured out a good way for kids to be online, even if the big tech companies say they have. There are simply too many threats out there—disturbing videos, bullies, data exploitation—and the current regulatory scheme is built on a skewed picture of how parents, kids, and companies interact with one another.

That scheme has a name: the Children’s Online Privacy Protection Act (COPPA), passed in 1998. Twenty years later, its central concerns have held up well. Sites meant for kids under 13, or sites with “actual knowledge” that kids under 13 use them, have to get explicit parental consent before collecting identifying information about those kids (and thus before using that information for targeted advertising). “Children, in the United States, are the only group of internet users who have opt-in rights” for how companies use their data, Jeff Chester, an early advocate of the law and the executive director of the Center for Digital Democracy, a consumer-advocacy group, told me. “That’s the basic framework of COPPA.” (What Markey was proposing would bump COPPA-style parental consent up to age 16.)

But on Monday, Chester’s organization and 22 other consumer-advocacy groups filed a complaint with the Federal Trade Commission alleging that Google’s platform YouTube has violated that framework because it has “actual knowledge” that kids under 13 use the site and nonetheless harvests kids’ phone numbers, geolocation data, and other unique identifiers as it would any older user. “We are reviewing the complaint and will evaluate if there are things we can do to improve,” a Google spokesperson told me via email, repeating a statement issued to the press earlier this week. “Protecting kids and families has always been a top priority for us.”

Sites like YouTube and Facebook sidestep COPPA by having users certify, when making an account or logging on, that they are 13 or older. But that doesn’t always align with reality, Ariel Fox Johnson, the senior counsel for policy and privacy at Common Sense Media, an advocacy group that joined the complaint, told me: Lots of kids get past that safeguard. (It would make sense that tech companies are less than scrupulous about this, given how dependent their business models are on growing user bases and collecting that data for advertising.)

YouTube has countless channels “that are very clearly targeting kids,” Fox Johnson said. “For them to say ‘We're not targeted toward kids on those channels’ is a little bit hard to take.” Indeed, YouTube is very popular with kids: Two-thirds of kids ages 6 to 12 already have a personal device, and roughly three-quarters of kids in that age group use YouTube daily, according to a separate market-research study cited in the complaint. Usage among kids is climbing, fast: Sixty-five percent of kids who use it hop on several times a day, up from 45 percent in 2015.

For these reasons and many more, tech companies have cordoned off separate spaces for children: Facebook announced Messenger Kids in December, for example, because parents “see value in these technologies,” Antigone Davis, the global head of safety at Facebook, told me, and “they want more control.” And “because YouTube is not for children,” the Google spokesperson said, the company built YouTube Kids three years ago “to offer an alternative specifically designed” for them.

But while those platforms offer parents more control, are often freer from ads, and insulate companies from a certain degree of liability, they’ve attracted controversies of their own. Many of the same advocacy groups criticizing YouTube are pressuring Facebook to shut down its new app, saying kids are too young to start navigating social media that most adults can barely handle themselves. Davis told me Messenger Kids will keep operating, but “we are continuing to do research in this area. Parents rightfully think they are in the best position to determine the needs and manage their children’s online experiences.” Facebook’s role, as she sees it, “is providing them with research-based information and guidance, and the tools that they need to control that experience for their children.”

YouTube Kids, too, has been criticized for hosting disturbing videos that crept from its main site past its algorithms: Kids have stumbled across videos of Mickey Mouse covered in blood, of characters from the animated series PAW Patrol committing suicide, or of a young girl apparently treated roughly and bleeding. While such videos account for a tiny fraction of the content the app’s 11 million weekly users see, they have drawn attention to the monumental task of sorting content with automated filters. And in any case, while YouTube Kids is booming, the same media-research survey of 8,200 kids and parents found that its parent site is still the “most powerful brand in kids’ lives.” (Google declined to comment on the details of the study.)

And that’s just for the platforms where kids belong. It’s not much easier to control how children get on the platforms where they’re not supposed to be, and thus, how their data sneaks past COPPA protections. “There’s huge incentives for [kids] and families to lie about age”—as in, confirming that kids are 13 or older when they’re not—“so even the data [about users that companies keep] gets corrupted,” Mimi Ito, a cultural anthropologist and professor at the University of California, Irvine, told me. Kids (and parents) want engaging, cheap, or free videos regardless of how old they are; in practice, Ito said, many children start navigating the web on their own, without parental consent, around age 10.

“If you’re a for-profit that provides free services or content that are of value to kids under 13, then it’s very difficult to be successful and to be COPPA-compliant,” Ito said. “And that’s where you’re seeing families routing around the policy as much as the companies are.” It’s simply not realistic to assume kids are truly getting parental consent as they start to form independent identities in early adolescence, she added. Asking for that all the way up to age 16 will hardly improve the odds. And, of course, parents are already some of the biggest violators of their kids’ privacy anyway, posting content and leaving digital footprints well before the age of consent. “Many parents are complicit in this,” Ito said.

That’s not to say the current situation is a catastrophe—there are protections, even if many parents ignore them. But the legal framework in the U.S. is patchy, a mélange of voluntary industry standards, FTC rules, and state laws. For legal solutions, American privacy advocates often pine for the rigor of Europe, which baked privacy concerns into its earliest laws governing the internet.

COPPA is a sort of “opt-in” privacy band-aid for one set of Americans. Right now, most legal options under discussion, such as Senator Markey’s, are simply a sort of “opt-in plus,” Ari Ezra Waldman, a professor at New York Law School, told me—expanding that law’s relatively basic guidelines, which govern the collection of data more than its use, to a broader group, or to everyone. But Waldman said the U.S. could do more by shifting toward a European-style “privacy by design”: “We should regulate how companies design their tools, and require that privacy and ethical considerations be embedded into products, rather than just notifying people of how bad their data practices are.”

The latest iteration of Europe’s privacy law is the General Data Protection Regulation (GDPR), which takes effect in May and includes requirements for kids. While the new rules incorporate aspects of U.S. law like COPPA, European agencies tend to go further in enforcement. The stricter policies there, Chester, of the Center for Digital Democracy, told me, might later be used to fight a “two-front war” for kids’ privacy in the U.S., “pushing for strong public policies in Congress, which are unlikely to be passed, [and] raising concerns at the corporate level, saying, ‘Hey Google, why are you treating kids’ privacy better in the EU than the kids in your own backyard?’”

Ito hopes, however, that the current fears over children’s data privacy don’t overshadow the creative and connective possibilities of the web. When her family, for instance, wants to learn how to fold origami lobsters together, they go to YouTube. In her research, she came across a girl who grew a Minecraft hobby into a Minecraft club at her school, which itself turned into a mini video-production operation. The kid internet, in other words, doesn’t have to be bad—it’s just that the way it’s currently regulated means it often is.