What Parents Did Before Baby Formula

The shortage is a calamity—not a victory for breastfeeding.

About the author: Carla Cevasco is a professor of American studies at Rutgers University–New Brunswick. She is the author of Violent Appetites: Hunger in the Early Northeast.

The baby was just two weeks old, and hungry. Elizabeth Hanson tried to breastfeed, but didn’t have enough milk. With terror, she watched as her daughter lost weight, tiny bones protruding from her skin.

In America, in modern times, most parents can count on multiple safe, healthy options for feeding an infant: breast milk or formula. That is, unless they are among the thousands of families across the United States affected by the current formula shortage.

But in 1724, Elizabeth Hanson couldn’t turn to formula when her milk dried up. Her story illustrates the nightmarish realities that confronted families before the development of modern commercial formula in the mid-20th century.

Hanson, an English colonist from New Hampshire, was captured from her home, along with her infant, by Wabanaki raiders during Dummer’s War. Famished, exhausted, and traumatized by her ordeal, she lost her milk supply. She fed her baby broth, when she could get it; when no other food was available, she warmed water in her mouth and let it dribble down over her breast for her infant to suck. But the baby was starving. Hanson later related her fear that the child was “more like[ly] to die than live.”

One of her captors, an older woman, noticed Hanson’s struggle. The woman showed Hanson a Wabanaki recipe for infant food: make walnut milk, and then boil it with fine cornmeal. This food was “very nourishing to the babe,” who “began to thrive and look well.”

I’m a scholar of the history of feeding infants and children in early America, and my research is full of stories of hungry babies like this one. Hanson, and the Wabanaki woman who saved her baby’s life, lived in an era when many babies who could not have breast milk died. In the 18th century, as in our time, some birthing parents and babies struggled to breastfeed. Milk supply lagged, nipples cracked and split, ducts clogged, abscesses and mastitis took hold (and, before the invention of antibiotics, could be deadly)—that is, if the mother hadn’t died in childbirth. Prematurity, tongue-tie, cleft palate, or other physiological problems kept infants from latching on.

Social factors might prevent breastfeeding too. Slaveholders forced enslaved women back into the fields soon after birth; mothers in poverty returned to work because they needed the pay. Moreover, many cultures recognized the partially contraceptive effects of lactation, or had taboos against sexual activity for nursing people. Historians including Paula Treckel and Richard Follett explain that a husband or enslaver who wanted a woman to return to fertility might deny her child the breast.

If the birthing parent could not breastfeed, for whatever reason, the family sought out other sources of breast milk. In the 17th and 18th centuries, most women spent decades pregnant or nursing, which meant that friends, neighbors, or relatives commonly nursed one another’s children. The wealthy could afford wet nurses, either enslaved women or paid servants, who breastfed others’ children under the most tragic of circumstances, a practice well documented by the historians Stephanie E. Jones-Rogers, Marissa C. Rhodes, and Janet Golden. Though some nurses were available because their own infants had already died, others nursed their enslavers’ or clients’ children at the expense of their own, many of whom starved.

Parents who could not afford a wet nurse turned to alternative baby foods. In early modern Europe and early America, caregivers mixed animal milk, water, or broth with flour, bread, or other grains, much like the Wabanaki blend of nut milk and cornmeal.

Such foods would have offered hydration and calories, but these benefits could come at a terrible cost. Spoiled or contaminated with pathogens in the days before modern food-safety standards, alternative infant foods could be deadly. Even at their best, they were rarely nutritionally sufficient for a child. Hanson’s experience of a baby thriving on an alternative diet was the exception, not the rule.

As the American food supply industrialized in the late 19th century, a new crop of milk substitutes became available. Their advertisements promised plump, happy babies. Such products were essential for the increasing numbers of women working outside the home, whether in domestic service or in Gilded Age factories that offered no worker protections and certainly no lactation rooms (the technology of pumping was extremely rudimentary at the time, anyway). But the plump infants on their labels notwithstanding, such foods were not necessarily more nutritionally complete than their homemade forebears of previous centuries.

Then there was the problem of cow’s milk—unpasteurized, unrefrigerated, occasionally adulterated, and shipped in open containers prone to contamination. At the turn of the 20th century, according to Jacqueline Wolf’s Don’t Kill Your Baby: Public Health and the Decline of Breastfeeding in the Nineteenth and Twentieth Centuries, Chicago public-health authorities estimated that bottle-fed babies were 15 times more likely to die than their breastfed counterparts. Close to 13 percent of all infants born in the United States died before their first birthday.

In response to the infant-feeding crisis, as Wolf relates, public-health officials spent the first decades of the 20th century cleaning up the milk supply. Pediatricians advised complicated “scientific” approaches to breastfeeding, such as scheduled feedings that inhibited milk supply. Rather than increasing breastfeeding rates, these innovations drove even more parents to bottle-feed. By the late 1960s, notes the historian Amy Bentley, only 20 to 25 percent of babies in the U.S. began their lives breastfeeding. The rest consumed formula, whether mixed at home from ingredients such as condensed milk and corn syrup or in the premixed forms that became commercially available in the 1950s.

Not until the 1960s and 1970s would countercultural movements begin to bring breastfeeding back into vogue. The argument that “breast is best,” taken up by medical and public-health authorities, has profoundly influenced policies on infant feeding in the late 20th and early 21st centuries, even as some research has pushed back against sweeping claims about breastfeeding’s health benefits to children.

Today in the United States, 84 percent of babies start breastfeeding at birth, though only 25.8 percent are breastfed exclusively through six months of age, as the CDC recommends. Breastfeeding rates also reflect larger race and class disparities, with the highest rates among wealthy white women.

The United States could, and should, do far more to support new parents. Policies such as paid family leave would enable more parents to breastfeed and to sustain breastfeeding for longer, if they so choose. But even if this country had the best social support system in the world, some families would still need or want to use formula.

The history of infant feeding before modern formula should caution us against gloating, as some have on social media, that the formula shortage will drive more parents to breastfeed. The past offers us a grim warning: Without safe, nutritionally complete alternatives to breast milk, infants will die.

The formula shortage is not a victory for breastfeeding. It is a calamity for families who, like families throughout history, just want to feed their children.