
People like to say that you are what you eat, but the truth is more like this: In the broad course of human history, we become what we eat. The contents of our ancestors’ dinner tables have slowly but surely left their signatures in the human genome. Learning to cook and soften our food was likely the major driver of our teeth shrinking during the Neolithic age. The lightening of Europeans’ skin is in part a product of dietary changes associated with farming.
The genes that let some adults drink milk with no attendant tummy troubles—a trait commonly called lactose tolerance—are a different story. A few different alleles, or versions of genes that influence a particular trait, can make for comfortable dairy consumption, and they’re all known for their unusually speedy spread. A new study mapping European milk consumption throughout history suggests that humans owe the quick proliferation of lactose tolerance to a legacy of famine and disease that began thousands of years after we became dairy fiends. In other words, lactose-intolerant people have been throwing back dairy for thousands and thousands of years. But whereas I think moaning to my boyfriend about my hot-girl tummy issues is just the sign of a tasty, tasty meal, our lactose-intolerant ancestors were more likely putting themselves through the digestive wringer just so they could survive.
Almost every human is born with the ability to digest the sugar called lactose, which is found in most dairy products as well as breast milk. For the majority of our history, our bodies stopped producing lactase, the enzyme needed to break down lactose, around late childhood. But over the past several thousand years, a few big slices of the world’s population evolved to continue churning out lactase in adulthood. While some of us are stuck with the ghosts of dairy past rattling around our digestive tracts, the lactose tolerant can down milkshakes and soft cheeses with no fear of pain and discomfort. Great for them. No, really, I’m so happy for you guys.
Humans have been drinking milk for nearly 10,000 years, but estimates suggest that the first lactase-persistence allele likely emerged in Europe around 5,000 B.C. Similar mutations followed in other areas of the world, eventually ushering in a period of wild spread of lactase persistence. In Europe, the ability to digest lactose really started to take off about 3,000 years ago. Now the overwhelming majority of people of European descent can chug milk into their golden years. (Nigeria and South Korea, by comparison, are 87 percent and 100 percent lactose intolerant, respectively.) The spread of a single trait to such a degree in just 3,000 years is “really crazy,” Mark Thomas, an evolutionary geneticist at University College London who helped lead the study, told me. It’s the blink of an eye compared with the million years usually needed to cement evolutionary change in a species.
Scientists have struggled to explain this rapid spread for decades. Experts once assumed that the health benefits of being able to drink milk comfortably, such as increased calcium intake in the nutritionally sparse Neolithic landscape, gave those with lactase persistence an advantage in fertility and survival. But contemporary research has found that while, yes, drinking more milk can help kids grow big and strong, milk drinkers don’t seem to live longer or better lives. What’s more, the relationship between lactase-persistence alleles and actual milk consumption varies widely: Culture (not the kind in yogurt), country, and ethnicity might play a larger role in who does and doesn’t drink milk than the enzymes in our guts. “This notion that people who were not lactase persistent wouldn’t drink milk because of symptoms that would be generated by it just isn’t borne out,” says George Davey Smith, an epidemiologist at the University of Bristol and another co-author of the study.
Data that have accumulated over the past two decades suggest that various peoples, not just in Europe, were consuming dairy thousands of years before they had evolved the ability to digest it, lactase be damned. Davey Smith, Thomas, and their chemist colleague Richard Evershed organized that evidence into an enormous database. They looked for traces of dairy in animal-fat residues from more than 13,000 dated pottery shards from across Europe. Then they searched for proof of lactase persistence in ancient-European DNA records and compared its frequency with the presence of milk-laced pottery. In some cases, lactase persistence didn’t become widespread until thousands of years after the pottery had first been used to hold milk. The researchers proposed that the alleles came under stronger natural selection in times when people were living in denser communities or when populations dropped—proxies for periods of hardship such as disease and famine. “The reality is that for the vast majority of human history, most people were sufficiently close to the breadline that anything that’s nutritious, they would have taken advantage of,” Thomas said.
The closer to starvation a society was, the study authors argue, the more likely it would have been to embrace dairy as a source of nutrition. In a famine, the sudden absence of crops would have made communities both underfed and overly reliant on fresh milk for nutrition. The stakes of digesting that milk would also have skyrocketed. “Diarrhea in a healthy person is a bit embarrassing, right? Diarrhea in somebody who’s severely malnourished is life or death,” Thomas said.
The potentially fatal consequences of GI symptoms haven’t been seriously examined elsewhere as a hypothesis for how lactose tolerance evolved, says Andrea Wiley, a medical anthropologist at Indiana University Bloomington. She told me that the idea is plausible, but also “kind of untestable.” Because of the limits of the archaeological record, the study’s authors are working off proxies for both milk consumption and genetic change, she noted.
The study also can’t account for the separate development of lactase-persistence alleles in other areas of the world, most notably in Africa, another continent where early evidence of milk consumption has been found. “One of the things about Europe is that it’s got a very good archaeological record,” Wiley said. But in Africa, she said, and particularly East and West Africa, dairying peoples historically “tend to be pastoralist, and more nomadic—the environmental conditions may or may not lead to good preservation.” Looking into lactase persistence in African populations could actually strengthen the case for famine as an evolutionary engine, Wiley said: Lactose intolerance, diarrheal disease, and food scarcity are all more common in African communities than European ones.
This study isn’t the only one with such limitations. Lactase-persistence research has overwhelmingly focused on Europe, Wiley said, in part because of the amount of money poured into it. That pattern, in turn, has skewed the world’s perceptions of milk drinking. “It really has become identified as a European trait—and that has led to all manner of racist ideologies around milk building better bodies,” she said. (Remember when white nationalists were chugging milk by the liter?)
For those of us who are perfectly at home in our dairy-averse bodies, the good news is that small quantities of lactose are unlikely to trigger any serious symptoms if we do decide to scarf down some cheese anyway. Lactose intolerance doesn’t stop us today, Thomas says, and it didn’t stop us thousands of years ago. We just can’t quit the stuff.