Babies, like tiny feral animals, learn with their mouths. It’s one of the earliest forms of problem-solving: Does this hand/iPhone/Cheerio taste good? What’s its texture? What does it do?

But babies also use their mouths to learn even when they aren’t putting foreign objects into them. In a study published earlier this week in the journal Proceedings of the National Academy of Sciences, a team of audiology and psychology researchers from the University of British Columbia established the first direct link between babies’ oral motor skills—the movement of the tongue, lips, and other parts of the mouth—and their ability to understand speech.

The study authors played two different “d” sounds—both common in Hindi, but not found in English—for six-month-olds from English-speaking households. As the sounds played, some of the babies had teething toys in their mouths: either a hard toy that restricted the movement of the tongue, which would impede the ability to make a “d” sound, or a soft pacifier that went between the gums and left the tongue unaffected. (The researchers ran ultrasounds of the babies’ mouths to confirm the effects of each toy.)

What they found: The mere presence of a teething toy didn’t make a difference, but the type of toy did—babies whose tongues were constrained couldn’t distinguish between the sounds as well as the others could.

“Before infants are able to speak, their articulatory configurations affect the way they perceive speech, suggesting that the speech production system shapes speech perception from early in life,” the study authors wrote.

Plenty of previous research has looked at the way that baby brains process speech, but less is known about the relationship between making speech and understanding it. A key part of comprehending language, it seems, may be trying to produce it. Babies are keen observers, but observation can only go so far—learning to walk, for instance, doesn’t just mean watching adults do it; it means putting one foot in front of the other. And understanding language, the study results suggest, doesn’t just mean listening to sounds; it means mimicking them as they happen.

“Until now, research in speech-perception development and language acquisition has primarily used the auditory experience as the driving factor,” Alison Bruderer, the study’s lead author and a postdoctoral fellow in speech sciences at UBC, said in a statement. “Researchers should actually be looking at babies' oral-motor movements as well.” Even before they’re talking, in other words, they’re turning snippets of language around in their mouths like Cheerios, trying to figure them out.