
“Tim over here has the original Linnaeus flower.”

I had come to Cold Spring Harbor Laboratory for its plants. A transplanted Englishman named Robert Martienssen met me in front of his lab, and we spent the morning admiring his mats of duckweed and tall stands of experimental corn. We went to one of the laboratory greenhouses to meet the farm manager, Tim Mulligan. He brought with him a black plastic pot with a flower.


Mulligan set it down on a counter made of planks, and I leaned in to inspect it. The pot contained a single plant, sprouting a dozen or so bright-yellow, trumpet-shaped blossoms. The petals wrapped around each other to create a long, closed tube that curled out at the end, forming a spiked, five-sided rim.

The flower I was looking at has a clear-cut pedigree. It’s a direct ancestor of a plant that was discovered in 1742 by a Swedish university student named Magnus Zioberg. Zioberg was hiking on an island near Stockholm when he happened to notice a trumpet-flowered plant. It confused him, because—aside from the flowers—it looked like a familiar plant called toadflax. The flowers of normal toadflax plants have a mirrorlike symmetry. They grow a few small yellow petals, some sprouting off to the left and others to the right, and a spike develops at the base of these flowers, pointing toward the ground. The flowers on the plant that Zioberg stumbled across had a circular symmetry instead.

Zioberg plucked the flower out of the ground, pressed it in a book, and brought it back to Uppsala University to show to his professor Olof Celsius. Celsius was thunderstruck. He immediately brought the flower to his colleague—and one of the most important naturalists in history—Carl Linnaeus.

Linnaeus was working at the time on a new system for classifying all plants and animals. It’s the system we still use today. To classify plants, Linnaeus paid particular attention to the shape of their flowers. When he looked at Zioberg’s discovery, he thought Celsius was having a joke at his expense. Celsius must have glued flowers from another species onto a toadflax plant to fool him. But Celsius assured Linnaeus it was genuine.

“This is certainly no less remarkable than if a cow were to give birth to a calf with a wolf’s head,” Linnaeus declared. He considered the trumpet-shaped flower a species of its own. He named it Peloria—from the Greek for “monster.”

To make sense of this “amazing creation of nature,” as he called it, Linnaeus speculated that it descended from ordinary toadflax. Pollen from another species had fertilized a toadflax plant, somehow triggering a sudden leap into a new form. To say such things in the 1740s—a century before Mendel’s and Darwin’s work—verged on heresy. Species were supposed to be fixed since creation. Heredity could not abruptly change course and make a new species.

“Your Peloria has upset everyone,” one bishop wrote in an angry letter to Linnaeus. “At least one should be wary of the dangerous sentence that this species had arisen after the Creation.”

In his later years, as he studied other specimens, Linnaeus became less sure of what the plants really were. He discovered that sometimes a single Peloria plant grew a mix of monstrous trumpet flowers and ordinary mirror-like ones. He couldn’t decide whether they were indeed a species of their own or some kind of strange variant that defied botany’s rules.

In the late 1990s, a group of English scientists turned their attention to Peloria, using the tools of molecular biology. Enrico Coen of the John Innes Centre in England and his colleagues examined a gene involved in making flowers, called L-CYC. In order for ordinary toadflax plants to develop flowers, they must switch on the L-CYC gene in the tips of their stems. In Peloria, Coen discovered, L-CYC stays silent.

This difference is not due to a mutation that altered the gene for L-CYC in Peloria. Coen and his colleagues found that the gene is identical in toadflax and Peloria. The difference between them was not in their DNA but around it.

Cells use their genes to make proteins and RNA molecules. And to keep some genes active and some silenced, they surround them with other molecules. Some molecules, called methyl groups, can coat part of a gene. They are often involved in keeping genes shut down. Other molecules coil up long segments of DNA, hiding the genes they contain. The study of genes is genetics; the study of the molecules that control genes is often referred to as epigenetics.

In Peloria, L-CYC had a heavy coating of methyl groups, preventing the flower’s gene-reading molecules from reading it. Coen and his colleagues noticed that as they bred new Peloria plants, they sometimes produced flowers that looked more like those of regular toadflax. When the scientists inspected the L-CYC gene in these throwbacks, they found that the gene had lost some of its methylation, allowing it to become more active again.

In Peloria, it seems, heredity has traveled down two channels. The flower has passed down copies of its genes, which guided the development of toadflax-shaped plants. But these plants also inherited a peculiar pattern of methylation that was not encoded in their genes. At some point before Zioberg stumbled across it in 1742, a toadflax plant accidentally added on methyl groups to its L-CYC gene. By silencing the gene, this methylation caused the flower to develop into a new shape. This newly altered flower then produced seeds, which inherited the same epigenetic mark. They fell to the ground, sprouted, and produced the same monstrously lovely flowers. Over the centuries that followed, some of their descendants lost the epigenetic mark, blooming into ordinary toadflax flowers once more. But other Peloria plants continued to inherit the wolf’s head of botany.


In the 1800s, scientists like Charles Darwin first framed heredity as a scientific question. They wanted to know what each generation transmitted to the next. At the dawn of the twentieth century, researchers spotted the first glimmers of genes. They found a way to link living things today to their biological past. This theory of heredity was pitted against Jean-Baptiste Lamarck’s claim that acquired characteristics could be passed down. Lamarck has remained an icon of pre-genetic thinking ever since.

It’s a role that’s unfair both to Lamarck and to history. The inheritance of acquired traits had been widely accepted for thousands of years before Lamarck was born. In Europe, scholars from the Middle Ages to the Enlightenment treated it as fact.

Regardless of whose name should be put on the idea, it continued to fall out of favor over the course of the 20th century. But some scientists continued to fight for conceptual room for more than one form of heredity. If we simply redefine heredity as genetics, they argued, we will never even look for those other channels.

Toward the end of the 20th century, a few cases came to light that looked an awful lot like the inheritance of acquired traits.

In 1984, a Swedish nutrition researcher named Lars Olov Bygren launched a study of people in Överkalix, a remote region of Sweden where he had grown up. For centuries, Bygren’s relatives had eked out a difficult existence along the banks of the Kalix River, fishing for salmon, raising livestock, and growing barley and rye. Every few years, they suffered devastating crop failures, leaving them with little food to eat during the six-month-long winters. In other years, the weather would swing far in their favor, bringing bumper crops.

Bygren wondered what sort of long-term effects these drastic changes had on the people of Överkalix. He picked 94 men to study. Studying church records, he charted their genealogies and discovered a correlation between their own health and the experiences of their grandfathers. Men whose paternal grandfathers lived through a feast season just before puberty died years sooner than the men whose grandfathers had endured a famine at that same point in their lives.

Women, Bygren found in a later study, also experienced an influence across the generations. If a woman’s paternal grandmother was born during or just after a famine, she ended up with a greater risk of dying of heart disease. It had long been known that a woman’s health while she was pregnant could influence a fetus, but Bygren’s research suggested the effects could stretch even further, to grandchildren or beyond.

Experiments on animals produced some similar results. In the early 2000s, Michael Skinner, a biologist at Washington State University, and his colleagues stumbled across one while they were investigating a fungus-killing chemical called vinclozolin. When Skinner and his colleagues gave vinclozolin to pregnant rats, their offspring, and even their grandsons, developed deformed sperm and other kinds of sexual abnormalities.

Skinner’s work inspired other researchers to look for other kinds of changes that could be inherited. Brian Dias, a postdoctoral researcher at Emory University, wondered if mice might even pass down memories.

Each day, Dias put young male mice in a chamber into which he periodically pumped a chemical called acetophenone. It has an aroma that reminds some people of almonds, others of cherries. The mice sniffed the acetophenone for 10 seconds, after which Dias jolted their feet with a mild electric shock.

Five training sessions a day for three days was enough for the mice to associate the almond smell with the shock. When Dias gave the trained mice a whiff of acetophenone, they tended to freeze in their tracks. Dias also found that a whiff of acetophenone made the mice more prone to startle at a loud noise. In other trials, Dias would pump an alcohol-like scent called propanol into the chamber instead, without giving the mice a shock. They didn’t learn to fear that odor.

Ten days after the training ended, researchers from Emory’s animal resources department paid Dias a visit. They collected sperm from the trained mice and headed off to their own lab. There they injected the sperm into mouse eggs, which they then implanted into females.

Later, after the pups had matured, Dias gave them a behavioral exam, too. Like their fathers, the new generation of mice was sensitive to acetophenone. Smelling it made them more likely to get startled by a loud sound, even though he had not trained the mice to make that association. When Dias allowed this new generation of mice to mate, the grandchildren of the original frightened males also turned out to be sensitive to acetophenone.

Dias then examined the nervous systems of these mice, hoping to find physical traces of the association. When mice learn to fear acetophenone, previous studies had shown, a particular patch of neurons in the front of the brain gets enlarged. The descendants of the trained mice showed the same enlargement.

The only link from the frightened fathers to their children and grandchildren was their sperm. Somehow, those cells had transmitted more than genes to their descendants. And somehow the animals passed down information not carried in their genes but gained through experience.


To explain such eccentric heredity, some scientists looked toward the epigenome, that collection of molecules that envelops our genes and controls what they do. The distinctive combinations of genes our cells keep switched on help to commit them to becoming muscle, skin, or some other part of the body. These patterns can be remarkably durable, enduring through division after division. That’s how little hearts grow into bigger hearts, instead of turning into kidneys.

Yet the epigenome is not simply a rigid program for turning genes on and off in a developing embryo. It is also sensitive to the outside world. Over the course of each day, for example, our epigenome helps drive our bodies through a biological cycle, activating certain genes during the day and silencing them through the night. The epigenome can alter the workings of genes in response to unpredictable signals as well. When we develop an infection, immune cells reorganize their DNA in order to go into battle mode against pathogens, allowing certain genes to start making proteins while silencing others. And as the immune cells multiply, they pass down this battle-ready epigenome to their descendants as a kind of cellular memory.

The memories we store in our brains may also endure thanks in part to changes we make to the epigenome. Starting in the mid-1900s, neuroscientists found that we sculpt the connections between neurons, pruning some and strengthening others, as new memories form. These patterns can endure for years. More recently, researchers have found that the formation of new memories is accompanied by some epigenetic changes. The coils of DNA in neurons get rearranged, for example, and new methylation patterns get laid down. These durable changes may ensure that neurons preserving long-term memories keep making the proteins they need to keep their connections strong.

The malleability of the epigenome is not an unalloyed good, though. Some studies suggest that stress and other negative influences can alter epigenetic patterns inside our cells, leading to long-term harm.

Some of the strongest evidence for this link has come from the laboratory of Michael Meaney at McGill University. In the 1990s, Meaney and his colleagues started a study to see how rats experience stress. If they put rats in a small plastic box, the animals got anxious, producing hormones that raised their pulse. Some rats reacted more strongly than others to the stress. It turned out that the rats that made more stress hormones had been licked less as pups by their mothers.

Working with Moshe Szyf, a McGill geneticist, Meaney investigated the physical differences that more licking or less licking produced in the animals. Meaney and Szyf inspected the neurons in the hippocampus, a brain region that they knew to be involved in mammalian stress control, and looked closely at their methylation. In rats that get licked a lot, they found relatively little methylation around the gene for the stress-hormone receptor. In rats that get licked a little, the methylation is much greater.

Meaney and Szyf proposed that when mothers lick their pups, the experience alters neurons in the hippocampus: Some of the methylation around their receptor gene gets stripped away. Freed from the methylation, the gene becomes more active, and the neurons make more receptors. In the well-licked pups, these neurons thus become more sensitive to stress, and rein it in more effectively. Rats that get little licking develop fewer receptors. They end up stressed out.

Given that rats and humans are both mammals, it’s possible that children may also undergo long-term changes to their stress levels from their upbringing. In one small but provocative study, Meaney and his colleagues looked at brain tissue from human cadavers. They selected twelve people who had died of natural causes, twelve who had committed suicide, and another twelve who had committed suicide after a history of abuse as children.

Meaney and his colleagues found that the brains of people who had experienced child abuse had relatively more methyl groups around their receptor gene, just as in the case of the under-licked rats. And just as those rats produced fewer receptors for stress hormones, the neurons of victims of child abuse had fewer receptors as well. It’s conceivable that the child abuse led to epigenetic changes that altered emotions in adulthood, snowballing into suicidal tendencies.

One geneticist named Steve Horvath has proposed that our epigenome changes at a steady rate, like the ticking of a biological clock. The idea first came to Horvath in 2011 while he was studying spit. He and his colleagues had collected saliva from sixty-eight people and fished out some cells from the cheek lining that had been shed into the fluid. Initially, Horvath tried to find a difference in the methylation patterns between heterosexuals and homosexuals. But no clear pattern came to light. Hoping to salvage the study, he decided to compare the saliva according to the ages of the subjects.

Horvath and his colleagues found two spots along people’s DNA where the methylation pattern tended to be the same in people of the same age. When they looked at other kinds of cells, they found other places where the methylation changed even more reliably as people got older. By 2012, Horvath was able to look at the methylation at sixteen sites in the DNA of nine different cell types. He could use those patterns to predict people’s ages, with an accuracy of 96 percent.

As provocative as such studies are, it’s still far from clear whether the epigenetic clock matters much. The same uncertainty hovers over studies on how negative experiences can trigger epigenetic changes in the brain and the body. These studies tend to be small, and sometimes when other scientists replicate them, they fail to see the same results. It’s even possible that the way scientists search for epigenetic change may trick them into seeing it where none exists. Perhaps the epigenetic clock is not produced by cells changing their epigenetic marks, for example. Perhaps some types of cells become more common as we get older, and those cells have different epigenetic marks than the cells more common in youth.

These uncertainties have not scared off scientists from studying epigenetics, however. The stakes are just too high. By cracking the epigenetic code, researchers may discover a link between nurture and nature. And if we can rewrite that code, we may be able to treat diseases by altering the way our genes work.


The role that epigenetics plays in our lives remains controversial—but the possibility that it could open a channel of heredity through the generations is vastly more so.

Critics charge that many studies on humans and mice are too small to be trusted. An epigenetic similarity between the generations could be a statistical fluke rather than a hereditary connection.

But some of the most potent attacks on this form of inheritance have been directed at the molecular details. It’s hard to see how exactly the experiences of parents can reliably mark the genes of their descendants. While it’s true that the methylation pattern in cells can change during people’s lifetimes, it’s not at all clear that those changes can be inherited.

The trouble with this hypothesis is that it doesn’t fit what we know about fertilization. A sperm carries its own payload of DNA, which has its own distinct epigenome as well. For example, sperm have to tightly wind their DNA in order to fit it inside their tiny confines. During fertilization, the sperm’s genes enter the egg, where they encounter proteins that attack the father’s epigenome. As the embryo starts to grow, the epigenetic drama continues. The embryonic cells strip away much of the remaining methylation on their DNA. And then they reverse course and start putting a fresh batch of methyl groups back on.

This new methylation helps cells in an embryo take on new identities. And when the embryo is around three weeks old, a tiny wedge of cells receives a set of signals that tell them they have been picked for immortality. They will become eggs or sperm. These cells alter their epigenome yet again. They strip off much of the methylation from their DNA.

Many scientists doubt that inherited epigenetic marks can survive all this stripping and resetting. If heredity is a kind of memory, methylation suffers radical amnesia in every generation.


The biology of animals may not offer many opportunities for epigenetic inheritance. But some scientists don’t see the door as entirely shut.

Our understanding of epigenetics depends on how well we can see it. When scientists began mapping the methylation that coats DNA, they could barely see it at all. In the 1990s, Enrico Coen could cut out a single gene and inspect it for methylation. Scientists then developed the tools for mapping the methylation across all the DNA in a cell. But they had to pull the DNA out of millions of cells at once to do so. If those cells belonged to subtly different types, each with a different pattern of methylation, the scientists could see only an epigenetic blur. By the 2010s, scientists were learning how to put cells on a kind of microscopic conveyor belt where they could inspect all the methylation in each cell, one at a time.

As our epigenetic focus has sharpened, old assumptions have turned out to be wrong. In 2015, for example, Azim Surani, a biologist at the Wellcome Institute in England, led one of the first studies on the epigenetics in human embryonic cells. In particular, he and his colleagues examined the cells that were on the path to becoming eggs or sperm. They observed these so-called primordial germ cells stripping away most of their methylation before applying a fresh coat. But a few percent of the methyl groups remained stubbornly stuck in place on the DNA.

A lot of the cells shared the same resistant stretches of DNA that held on to their old epigenetic pattern. These stretches contained virus-like pieces of DNA called retrotransposons. They can coax a cell to duplicate them and insert the new copy somewhere else in the cell’s DNA. Methylation can muzzle these genetic parasites.

Retrotransposons typically sit near protein-coding genes, and it is possible that those genes get muzzled, too. Surani and his colleagues found that some of the genes near the stubborn methylation sites have been linked to disorders ranging from obesity to multiple sclerosis to schizophrenia. Based on their experiments, the scientists concluded that these genes are promising candidates for transgenerational epigenetic inheritance.

Scientists have also started turning their attention to other molecules that can control genes. RNA molecules—single-stranded versions of DNA—have emerged as powerful choreographers of the cell. When a sperm cell fertilizes an egg, they combine not just their DNA, but their RNA molecules as well. Researchers are investigating whether RNA molecules have what it takes to create a channel of heredity between the generations.

Antony Jose, a biologist at the University of Maryland, tracks RNA molecules produced inside the body of a tiny worm called Caenorhabditis elegans. RNA molecules created in the worm’s brain can make their way across its body and end up inside its sperm, where they turn off a gene. Other researchers have found that RNA molecules in the worms can turn off the same gene in the next generation, and for several generations after that. It appears that the RNA molecules sustain themselves through the generations by spurring young worms to make more copies of the molecules.

We are not worms, of course, but a number of experiments have demonstrated that human cells can send RNA molecules to each other on a regular basis. Very often, they are delivered in tiny bubbles, called exosomes. Scientists have observed more and more types of cells releasing exosomes, and more and more taking them up. In some species, embryos may use exosomes to send signals between parts of the body to make sure they all develop in sync. Heart cells may release them after a heart attack to trigger the organ to repair itself. Cancer cells spew out exosomes with exceptional abandon—probably as a way to manipulate surrounding healthy cells into becoming their servants. In 2014, an Italian biologist named Cristina Cossetti observed that exosomes cast off by cancer cells in male mice could deliver their RNA into the animals’ sperm cells. None of this is proof that RNA can sustain epigenetic inheritance. But it certainly makes the idea fun to think about.

But even if there is a link from somatic cells to the germ line and to future generations, it won’t be enough to resurrect Lamarck. What made Lamarck’s theory so seductive in the nineteenth century was the idea that the acquired traits were adaptive. In other words, they helped animals and plants survive, enabling species to fit themselves to their environment. Lamarck believed his version of evolution could explain why species were so well matched to their surroundings. In Lamarck’s world, giraffes stretched their necks and ended up with the longer necks they needed to get food.

Meanwhile, at Cold Spring Harbor, Martienssen is studying the organisms with the clearest evidence for epigenetic inheritance: plants.

It’s not just Linnaeus’s monster that offers the evidence. In many species, droughts, insect attacks, and other experiences can alter plants for many generations. Martienssen and his colleagues are using the latest epigenetic technology to figure out exactly what molecules are sustaining those changes.

At one point during my visit, Martienssen surprised me by asking if I had ever heard of Luther Burbank. Burbank was a California plant breeder, now long forgotten. In the late 1800s, he was sensationally famous as a “plant wizard,” concocting new plant varieties as if by hereditary magic. (The Burbank potato and the Shasta daisy are just two examples of his handiwork.)

Working before the dawn of genetics, Burbank offered bizarre, vaguely Lamarckian explanations for his work. He liked to “stir up the heredity” of his plants, he said. The first geneticists took pilgrimages to his California farm, but came away with no new insights. Eventually, Burbank was dismissed as a crank.

But Martienssen remains fond of Burbank. In his homespun way, Burbank seemed to be talking about epigenetics. Martienssen stared off into the middle distance and recited to me a line of Burbank’s that he drops into his lectures and papers whenever he can.

“Heredity,” Burbank declared, “is only the sum of all past environment.”


This post is adapted from Zimmer’s new book, She Has Her Mother’s Laugh: The Powers, Perversions, and Potential of Heredity.
