Can Babies Understand the World From Birth?

There are surprising similarities between their brains and adults’.

Riley LeBlanc examines her brain. (Caitlin Cunningham / Quanta)

Rebecca Saxe’s first son, Arthur, was just a month old when he first entered the bore of an MRI machine to have his brain scanned. Saxe, a cognitive scientist at the Massachusetts Institute of Technology, went headfirst with him: Lying uncomfortably on her stomach, her face near his diaper, she stroked and soothed him as the three-tesla magnet whirred around them. Arthur, unfazed, promptly fell asleep.

All parents wonder what’s going on inside their baby’s mind; few have the means to find out. When Saxe got pregnant, she’d already been working with colleagues for years to devise a setup to image brain activity in babies. But her due date in September 2013 gave the effort a hard deadline.

Over the past couple of decades, researchers like Saxe have used functional MRI to study brain activity in adults and children. But fMRI, like a 19th-century daguerreotype, requires subjects to lie perfectly still lest the image become hopelessly blurred. Babies are jittering bundles of motion when not asleep, and they can’t be cajoled or bribed into stillness. The few fMRI studies done on babies to date mostly focused on playing sounds to them while they slept.

But Saxe wanted to understand how babies see the world when they’re awake; she wanted to image Arthur’s brain as he looked at video clips, the kind of thing that adult research subjects do easily. It was a way of approaching an even bigger question: Do babies’ brains work like miniature versions of adult brains, or are they completely different? “I had this fundamental question about how brains develop, and I had a baby with a developing brain,” she said. “Two of the things that were most important to me in life temporarily had this very intense convergence inside an MRI machine.”

Saxe spent her maternity leave hanging out with Arthur in the machine. “Some of those days, he didn’t feel like it, or he fell asleep, or he was fussy, or he pooped,” she said. “Getting good data from a baby’s brain is a very rare occurrence.” Between sessions, Saxe and her colleagues pored over their data, tweaking their experiments, searching for a pattern in Arthur’s brain activity. When they got their first usable result when he was 4 months old, she said, “I was through the roof.”

A recent paper published in Nature Communications is the culmination of more than two years of work to image brain activity in Arthur and eight other babies. In it, her team finds some surprising similarities in how the babies’ and adults’ brains respond to visual information, as well as some intriguing differences. The study is a first step in what Saxe hopes will become a broader effort to understand the earliest beginnings of the mind.

* * *

Functional MRI is perhaps the most powerful tool scientists have to study brain activity short of opening up the skull. It relies on the fact that more-active areas of the brain draw more blood, a change that creates a detectable signal in an MRI machine. The technique has drawn some criticism: It’s an indirect measure of brain activity, and the simple, striking images it produces rely on behind-the-scenes statistical manipulation. Nevertheless, fMRI has opened up entirely new avenues of research by giving scientists what Saxe calls “a moving map of the human brain.” It has revealed, in incredible detail, how different parts of the brain choreograph their activity depending on what a person is doing, perceiving or thinking.

Some areas of the cortex also seem to be purpose-specific. Nancy Kanwisher, a neuroscientist at MIT and Saxe’s former adviser, is known for discovering an area called the fusiform face area, which responds to images of faces more than any other visual input. Her lab also led the discovery of the parahippocampal place area, which preferentially responds to scenes depicting places. When she was a graduate student in Kanwisher’s lab, Saxe discovered an area of the brain that was devoted to “theory of mind”—thinking about what other people are thinking. Since then, research by several labs has identified regions of the brain involved in social judgments and decision-making.

Saxe, who speaks rapidly and radiates an intellectual intensity, gets most animated by philosophical and deeply fundamental questions about the brain. For her, the next obvious question is: How did the brain’s organization come about? “Seeing in adults these incredibly rich and abstract functions of your brain—morality, theory of mind—it just raises the question, how does that get there?” she said.

Have our brains evolved to have special areas devoted to the things most important for our survival? Or, she said, “is it that we were born with an amazing multipurpose learning machinery that could learn whatever organization of the world it was given to discover?” Do we enter the world with an innate blueprint for devoting parts of our brains to faces, for instance, or do we develop a specialized face area after months or years of seeing so many people around us? “The basic structural organization of the human brain could be similar from person to person because the world is similar from person to person,” she said. Or its outlines could be there from birth.

* * *

Riley LeBlanc spits out her pacifier and starts to cry. A 5-month-old with a mop of curly brown hair, she’s fussing in her swaddle as Heather Kosakowski, Saxe’s lab manager, stands by the hulking MRI machine, housed in the bottom floor of MIT’s Brain and Cognitive Sciences building, and bounces Riley up and down. Lori Fauci, Riley’s mother, seated on the scanning bed, pulls another pacifier from her back pocket to offer her child.

Everything here is designed to soothe Riley. The room is softly lit, and speakers play tinkling, toy-piano versions of pop songs as lullabies (currently: Guns N’ Roses’ “Sweet Child o’ Mine”). On the scanning bed lies a specially designed radio-frequency coil—an angled lounger and baby-sized helmet—to act as an antenna for radio signals during scans. The MRI machine is programmed with special protocols that generate less noise than usual, to avoid harming the babies’ delicate hearing.

It takes a few false starts before Riley is willing to lie in the coil without fussing. Her mother positions herself on her stomach with her face and hands near Riley to soothe her. Kosakowski slides mother and child into the scanner and moves to a windowed anteroom, while Lyneé Herrera, another lab member, stays in the MRI room and gives hand signals to Kosakowski to let her know when Riley’s eyes are open and watching the mirror above her head, which reflects images projected from the back of the machine.

The team’s goal is to collect about 10 minutes of data from each baby while she’s motionlessly watching the videos. To achieve that, the researchers often need to average together data from multiple two-hour sessions. “The more times a baby comes, the more likely we are to get that full 10 minutes,” Kosakowski said. This is Riley’s eighth visit.

When Herrera signals that Riley has stopped napping, Kosakowski initiates the scanner and cues a series of video clips, as babies are more likely to look at moving images than still ones. After a while, Herrera closes her hand, signaling that Riley’s eyes are closed again. “Sometimes I think babies must be getting their best naps here,” Kosakowski said with a laugh.

Studying infants has always required creative techniques. “It’s been an interesting problem,” said Charles Nelson, a cognitive neuroscientist at Harvard Medical School and Boston Children’s Hospital who studies child development, “because you’re dealing with a nonverbal, rhetorically limited, attentionally limited organism, trying to figure out what’s going on inside their head.” Similar techniques are often used to study babies and nonhuman primates, or children with disabilities who are not verbal. “We have a class of covert measures that allows us to peek inside the monkey, the baby, the child with a disorder,” Nelson said.

The simplest is watching their behavior and noting where they look, either by observing them or using eye-tracking technologies. Another is to measure brain activity. Electroencephalography (EEG), for instance, simply requires attaching an adorable skullcap of electrodes and wires to a baby’s head to detect fluctuating brain waves. And a newer technique called near-infrared spectroscopy (NIRS) sends light through babies’ thin, soft skulls to detect changes in blood flow in the brain.

Both methods reveal how brain activity changes moment to moment, but NIRS only reaches the outer layers of the brain, and EEG can’t show exactly which brain areas are active. “To study the detailed spatial organization, and to get to deeper brain regions, you have to go to fMRI,” said Ben Deen, first author of the Nature Communications study, who’s now a researcher at Rockefeller University.

Using other methods, researchers have found hints that babies respond differently to visual inputs of different categories, particularly faces. Faces “are a very salient part of [the] environment,” said Michelle de Haan, a developmental neuroscientist at University College London. In the first few weeks of life, an infant’s eyes focus best on objects around the distance of a nursing mother’s face. Some researchers believe babies may have an innate mechanism, deep in the brain, that directs their eyes to look at faces.

There’s evidence that young infants will look longer at faces than at other things. A baby’s response to faces also becomes more specialized over time and with experience. For instance, adults have a harder time discriminating between two faces when they’re upside down, but babies under 4 months of age don’t have this bias—they can discriminate between two upside-down faces as easily as two right-side-up ones. After about 4 months, though, they acquire a bias for right-side-up faces. Around 6 months of age, infants who see faces produce an EEG signature of activity that is similar to that of adults who see faces.

Infant brains appear to distinguish between faces and natural scenes. The regions of the infant brain that respond to faces or scenes, respectively, match those of adult brains. The work reveals that at 4 to 6 months of age, the infant brain is already organized in a similar way to an adult brain.
(Department of Brain and Cognitive Sciences and McGovern Institute, MIT / Quanta)

But while this research suggests that babies might have some specialization in their brain for certain categories like faces, Deen said, “we knew very little about the detail of where those signals are coming from.”

For their current paper, Saxe and her colleagues obtained data from nine of the 17 babies they scanned. Though the lab is increasingly relying on outside families recruited into studies, it helped that they had a spate of “lab babies” to start with, including Arthur; Saxe’s second son, Percy; her sister’s son; and a postdoc’s son. They presented babies with movies of faces, natural scenes, human bodies and objects—toys, in this case—as well as scrambled scenes, in which parts of the image are jumbled. Saxe said they focused on faces versus scenes because the two stimuli create a sharp difference in adult brains, evoking activity in very different regions.

Surprisingly, they found a similar pattern in babies. “Every region that we knew about in adults [with] a preference for faces or scenes has that same preference in babies 4 to 6 months old,” Saxe said. That shows that the cortex “is already starting to have a bias in its function,” she said, rather than being totally undifferentiated.

Are babies born with this ability? “We can’t strictly say that anything is innate,” Deen said. “We can say it develops very early.” And Saxe points out that the responses extended beyond the visual cortex (the structures of the brain responsible for directly processing visual inputs). The researchers also found differences in the frontal cortex, an area of the brain involved in emotions, values and self-representation. “To see frontal cortex engagement in a baby is really exciting,” she said. “It’s thought to be one of the last spots to fully develop.”

However, while Saxe’s team found that similar areas of the brain were active in babies and adults, they did not find evidence that infants have areas specialized for one particular input, like faces or scenes, over all others. Nelson, who was not involved in the study, said it suggests that infant brains are “more multipurpose.” “That points out a fundamental difference in the infant brain versus the adult brain.”

* * *

It’s surprising that babies’ brains behave like adults’ brains at all considering how different they look. On a computer screen outside the MRI room at MIT, I can see anatomical images of Riley’s brain that were taken while she napped. Compared to MRI scans of adult brains, in which different brain structures are clearly visible, Riley’s brain seems creepily dark.

“It looks like this is just a really poor image, doesn’t it?” Kosakowski said. She explains that babies at this stage have not yet fully developed the fatty insulation around nerve fibers, called myelin, that makes up the brain’s white matter. The corpus callosum, a yoke of nerve fibers connecting the two hemispheres of the brain, is only dimly visible.

At this age, a baby’s brain is expanding—the cerebral cortex swells by 88 percent in the first year of life. Its cells are also reorganizing themselves and rapidly forming new connections to one another, many of which will get winnowed back throughout childhood and adolescence. At this stage, the brain is astonishingly flexible: When babies have strokes or seizures so severe that an entire hemisphere of the brain must be surgically removed, they recover remarkably well. But there are also limits to this flexibility; babies who experience deprivation or abuse may have lifelong learning deficits.

Studying how healthy human brains develop can help scientists understand why this process sometimes goes awry. It’s known, for instance, that many children and adults with autism have difficulties with social cognition tasks, such as interpreting faces. Are these differences present at the earliest stages of the brain’s development, or do they emerge out of a child’s experience, driven by a lack of attention to faces and social cues?

We’re only beginning to understand how babies’ brains are organized; it will require many more hours collecting data from a larger number of babies to have a fuller picture of how their brains work. But Saxe and her colleagues have shown that such a study can be done, which opens up new areas of investigation. “It is possible to get good fMRI data in awake babies—if you are extremely patient,” Saxe said. “Now let’s try to figure out what we can learn from it.”

This article appears courtesy of Quanta Magazine.