There is a picture I have in my mind that captures an important facet of childhood. Its caption reads: "Small Human With Marble In Nose." We trot this image out at family gatherings—there is surely a relative on hand who was once such a Small Human—to illustrate the pleasant idea that children, at their most endearing and most exasperating, are explorers/inventors/geniuses in miniature. By this school of thought, jamming a small object up one’s nose is just a foray into The Scientific Method gone a bit awry.
But imagine this picture as a cartoon. Get rid of baby Einstein’s precocious smile, add in a runny nostril (the unobstructed one), and rewrite the caption: SURVIVAL OF THE FITTEST?
This is not just a joke. Developmental and evolutionary psychologists study babies very seriously because they believe that the not-quite-clean slates of baby brains can give us surprising insight into our evolutionary past. They reason that the early history of an individual human’s mind is a sort of analogue of the early history of all humankind. Because infant minds have not yet been shaped by the empirical world around them, their behavior is largely the product of elemental hard-wiring in the brain—the kind of circuitry that dates all the way back to our days as hunter-gatherers. In short, all the newfangled nurture of hip mommies and daddies has not yet taken hold in these minds, which leaves scientists with an unusually clear view of very old-fashioned nature.
A study published last week in the journal Cognition by a team of Yale researchers deployed this line of analysis and produced a headline that no doubt caught the eye of many Baby Bjorn-toting parents: Babies Reluctant to Grab Plants. The study reported that infant subjects took much longer to touch plants than a variety of other objects.
The study’s lead author, Annie Wertz, wants to be clear about what this research is not: “We are in no way claiming that infants can be left alone with plants or anything like that,” she says. She reels off this caveat dutifully because she wants to discuss what truly interests her about these latest findings. Wertz's radical hypothesis is that babies’ wariness toward plants is a remnant of an ancient survival mechanism.
For our ancestors, plants were both essential (a well-gathered meal, a well-thatched roof, etcetera) and deadly (an ill-advised snack in the woods, a mis-measured herbal remedy). But dangerous plants don’t announce themselves in any obvious way: They don’t gnash their teeth or flash their claws, and they definitely don’t chase you through the forest. For Wertz, this sets up “a really interesting learning problem.” You live in the midst of one leafy green thing after another, but you have no idea which ones are safe for dinner.
“If we only ate one thing, then you could imagine that maybe there would be some kind of program for identifying that specific thing. But our diets are very broad, and humans live in a wide range of environments. So each individual human, when they are born, has to solve the problem of which plants in this particular environment are the dangerous ones, and which are the edible ones.”
Wertz had read about the strategies that other creatures use to navigate this difficult cost-benefit analysis. There are simple physiological mechanisms that help: for example, chemicals in the guts of many animals will break down plant toxins. And there are more impressive behavioral mechanisms, too. Some animals have learned to merely nibble at plants they are trying for the first time. That way, even if the plant turns out to be hazardous, they’ve only ingested a Whole Foods-sized sample.
According to Wertz’s hypothesis, what wound up working best for early humans was what she calls a “social learning mechanism.” Hunter-gatherers mostly played it safe, avoiding plants until they could reliably figure out whether they were edible. The most helpful clues, it turns out, came from each other: If Mom eats parsley, you can too.
For humans today, Wertz thinks something astonishingly similar applies. Babies are reluctant to engage with plants until they pick up social cues from the people around them that indicate whether it’s okay to do so.
In this most recent study, babies sat in their parents’ laps while they were presented with several objects (within reach, but not available for insertion into the nose). Infants were shown a real plant, an artificial plant, and a “novel artifact”—an object that did not look remotely like a plant. Forty-five out of 47 infants took longer to touch the plants (either real or fake) than the other artifact.
First, Wertz considered the dullest explanations for these findings. “We thought, hm, maybe they’re just avoiding leaf-shaped things.” Or leaf-feeling things—the kind of delicate stuff babies may have already learned not to touch. To dispense with this possibility, Wertz and her team used the very leaves from the artificial plants to decorate the “novel artifacts.” They snipped them, dyed them black, and draped them as a fringe. The result is something that looks like a piece of Martian home décor. The leaf material itself didn't seem to be what was deterring the infants.
This was crafty, but it created another problem. What if the babies didn’t actually dislike the plants; what if they just really liked the alien accessory? Wertz designed another experiment to see if she could dispense with the “novelty” argument: that the infants were just reaching for something new and cool. She kept the “novel artifacts” the same, but replaced the plants with carefully chosen items: a lamp (a familiar object that babies aren’t supposed to touch), a spoon (a familiar object that babies are allowed to touch), and a seashell (a natural, inanimate object that isn’t plant-like). In these experiments, the babies took about the same amount of time to touch all the objects, suggesting it wasn’t a particular interest in the novel item that had caused them to spurn plants in the first place.
This isn’t, of course, an exhaustive process of elimination. You might still think that Wertz has done some rather gymnastic leaping to reach her evolution-based theorizing. She understands—and can easily parrot—this skepticism: “You’re just taking a known phenomenon and you’re spinning some kind of story after the fact about what purpose this might have served in an ancestral past that we know nothing about.”
I expected Wertz to respond to such criticism by ticking through the evidence for why she has found the right answer. What she seems to care much more about, however, is defending the way she went about finding it. “Taking an evolutionary approach to human cognition,” she says, “is fundamentally about hypothesis generation.” In many ways, the true labor of Wertz’s research is learning enough—about everything from the chemical compounds in plants to the teeth of prehistoric humans—to come up with a question that is smart and strong enough to be worth answering.
“It could have been the case that babies didn't show this response, in which case I would have been wrong,” Wertz says. “And that would have been okay.” She goes even further: “It is also the case that it could turn out, as with any kind of finding, that this is the result of something else, and I’m open to that possibility. But this is where hypotheses come from.”
This modesty can sound a little cheesy, and we shouldn’t be won over simply by the refreshing tenor of someone who admits she might be wrong. But it’s worth noting that this isn’t just personal humility. This flexibility is at the core of how experts like Wertz understand evolution—and how amateurs like us should, too.
As Wertz says, “The folk notions of things that evolved is that they are hard-wired and inflexible and instinctual, but that’s really not the case. Any kind of evolved structure can have any kind of configuration.”
As we end our conversation, Wertz is careful to add: “Of course there are other processes at play, too … there are things that can just be noise in the system.” And that, everyone, is a scientific verdict on all the marbles that have ever found their way inside a nostril.