Genes Are Overrated

Their discovery wasn’t predestined, nor do they dictate our destinies—and current ideas about them may die.


In the Darwinian struggle of scientific ideas, the gene is surely among the select. It has become the foundation of medicine and the basis of vigorous biotechnology and pharmaceutical industries. Media coverage of recent studies touts genes for crime, obesity, intelligence—even the love of bacon. We treat our genes as our identity. Order a home genetic-testing kit from the company 23andMe, and the box arrives proclaiming, “Welcome to you.” Cheerleaders for CRISPR, the new, revolutionarily simple method of editing genes, foretell designer babies, the end of disease, and perhaps even the transformation of humanity into a new and better species. When we control the gene, its champions promise, we will be the masters of our own destiny.


The gene has now found a fittingly high-profile chronicler in Siddhartha Mukherjee, the oncologist-author of the Pulitzer Prize–winning The Emperor of All Maladies, a history of cancer. The Gene’s dominant traits are historical breadth, clinical compassion, and Mukherjee’s characteristic graceful style. He calls it “an intimate history” because he shares with us his own dawning awareness of heredity and his quest to make meaning of it. The curtain rises on Kolkata, where he has gone to visit Moni, his paternal cousin, who has been diagnosed with schizophrenia. In addition to Moni, two of the author’s uncles were afflicted with “various unravelings of the mind.” Asked for a Bengali term for such inherited illness, Mukherjee’s father replies, “Abheder dosh”—a flaw in identity. Schizophrenia becomes a troubling touchstone throughout the book. But the Indian interludes are tacked onto an otherwise conventional triumphalist account of European-American genetics, written from the winners’ point of view: a history of the emperor of all molecules.

In 1931, the English historian Herbert Butterfield called this approach “the whig interpretation of history.” Most historians, he wrote, were the epitome of the 19th-century English gentleman: “Protestant, progressive, and whig.” The Whig historian “very quickly busies himself with dividing the world into the friends and enemies of progress.” The danger of Whig history is that it justifies the dominance of the ruling class as the outcome of inexorable natural forces. It is especially seductive when writing about science, for scientific knowledge does indeed progress. When Butterfield wrote The Origins of Modern Science (1949), even he produced an inadvertent model of the form.

Mukherjee gives us a Whig history of the gene, told with verve and color, if not scrupulous accuracy. The gene, he tells us, was first described by the Augustinian friar Gregor Mendel, in the mid-19th century. Tragically, no one noticed—not even the great Charles Darwin. “If Darwin had actually read” the reference to Mendel in a volume on Darwin’s own shelves, Mukherjee writes, it “might have provided the final critical insight to understand his own theory of evolution.” The “missing link” in Darwin’s day, he continues, was “information,” by which he means genetic or hereditary information.

As Mukherjee’s story enters the 20th century, the link is still missing. He writes that the Rockefeller Institute’s Phoebus Levene thought the “comically plain” structure of DNA “disqualified it as a carrier of genetic information”; he even called DNA, Mukherjee reports, a “stupid molecule.” But decades later, at the same institution, the truth would out. The shy and bespectacled Oswald Avery thrust “DNA, once the underdog of all molecules … into the limelight” by showing, in 1944, that it, and not protein, was the “transforming principle” that turns harmless pneumococcus bacteria into virulent pneumonia-causing germs. In 1953, the brazen Watson and Crick got the credit for solving the double helix, while the heroic Rosalind Franklin and Maurice Wilkins were slighted, their roles minimized.

On we march, past some of the greatest and most ambitious minds of science. They crack the genetic code and tweeze apart the Swiss-watch mechanism by which the cell reads out genetic instructions to build functional proteins. They develop genetic engineering in the 1970s. They sequence the human genome in the 1990s. We also meet villains, such as the eugenicists—both American and German—who subverted genetic knowledge, steering it toward nefarious ends. Mukherjee very quickly busies himself with dividing the world into the friends and enemies of scientific Truth.

The antidote to such Whig history is a Darwinian approach. Darwin’s great insight was that while species do change, they do not progress toward a predetermined goal: Organisms adapt to local conditions, using the tools available at the time. So too with science. What counts as an interesting or soluble scientific problem varies with time and place; today’s truth is tomorrow’s null hypothesis—and next year’s error.

This approach lets us see that DNA was not the “underdog of all molecules.” Its structure was considered anything but “comically plain.” Nobel Prizes were awarded three times for elucidating aspects of it: in 1910 (Albrecht Kossel), 1957 (Alexander Todd), and 1962 (Watson, Crick, and Wilkins). There’s no evidence that Phoebus Levene—Kossel’s student—called it a “stupid molecule,” as Mukherjee claims. Max Delbrück did, in the mid-1940s, after Oswald Avery and colleagues had shown it to be the molecule of heredity in pneumococcus. Delbrück, Watson’s most important mentor, used such blunt skepticism to spur scientific rigor among his followers. The “stupid molecule” remark, then, is best understood as prologue to the solution of the double helix in 1953, rather than as an obstacle to its having been solved sooner.

Explore the June 2016 Issue


Before Watson and Crick described the gene as a sequence of DNA, visualized as a succession of letters—like a line of computer code—terms such as information would have been nonsensical. Genes had been imagined as beads strung along the chromosomes. They didn’t “encode” anything; they simply carried traits. The term gene wasn’t coined until 1909. Before the turn of the 20th century, Mendel’s elemente were not thought of as physical things. They were mere abstractions. Saying that Darwin lacked the concept of information is like pointing out that T. rex lacked an iPhone.

If Mukherjee had actually read the reference to Mendel he cites from Darwin’s bookshelf, he would have seen that it discussed hybrids and pollination. It gave not a clue to what would later be called Mendel’s laws of heredity. Moreover, Darwin understood his own theory perfectly: It assumes that heritable variation occurs, but it does not depend on knowing where that variation comes from.

This handful of errors, drawn from a sackful of options, illustrates a larger point. The Whig interpretation of genetics is not merely ahistorical; it’s anti-scientific. If Copernicus displaced the Earth from the center of the universe and Darwin displaced humanity from the pinnacle of the organic world, a Whig history of the gene puts a kind of god back into our explanation of nature. It turns the gene into an eternal, essential thing awaiting elucidation by humans, instead of a living idea with ancestors, a development and maturation—and perhaps ultimately a death.

There is a subtler gene, and Mukherjee acknowledges it when he doffs his history hat and dons his white coat. Over the past three decades, the rise of genomics—the move from studying single genes to analyzing and comparing whole genomes—has led to a newly sophisticated understanding of how our DNA influences disease and behavior. As a clinician, Mukherjee grasps this complexity. He understands the humanitarian cost of essentializing the gene. “Genes cannot tell us how to categorize or comprehend human diversity,” he writes. “Environments can, cultures can, geographies can, histories can.” He has well-justified qualms about scientific hubris, about technology getting ahead of ethics. He details with due solemnity and reflection the recklessness of early gene therapy. Its few successes, he takes care to note, were overshadowed by tragedies such as the case of Jesse Gelsinger in 1999. When Gelsinger, a teenager from Arizona with a rare disorder, was injected with a modified virus carrying a supposedly corrective gene, a cascade of unintended consequences was triggered. His organs shut down and he died.

Mukherjee writes eloquently about the limitations of medical genetics. Thus far, he observes, scientists have compiled an impressive (if incomplete) “backward catalog” of gene function: Given a person with a set of symptoms, what gene variants does one tend to find? But clinical genetics needs a forward catalog: “If a child has a mutant gene, what are the chances that he or she will develop the syndrome?” In many cases, he notes, such a catalog may be unattainable. Genetic tests for complex diseases such as schizophrenia or autism are unlikely to be very predictive. As Bryna Siegel, an autism expert at the University of California at San Francisco, has put it, genetic counselors may have to say, “Mrs. Smith, here are the results of your amnio. There’s a one-in-10 chance that you’ll have an autistic child, or the next Bill Gates. Would you like to have an abortion?”

Ironically, the more we study the genome, the more “the gene” recedes. A genome was initially defined as an organism’s complete set of genes. When I was in college, in the 1980s, humans had 100,000; today, only about 20,000 protein-coding genes are recognized. Those that remain are modular, repurposed, mixed and matched. They overlap and interleave. Some can be read forward or backward. The number of diseases understood to be caused by a single gene is shrinking; most genes’ effects on any given disease are small. Only about 1 percent of our genome encodes proteins. The rest is DNA dark matter. It is still incompletely understood, but some of it involves regulation of the genome itself. Some scientists who study non-protein-coding DNA are even moving away from the gene as a physical thing. They think of it as a “higher-order concept” or a “framework” that shifts with the needs of the cell. The old genome was a linear set of instructions, interspersed with junk; the new genome is a dynamic, three-dimensional body—as the geneticist Barbara McClintock called it, presciently, in 1983, a “sensitive organ of the cell.”

The point is not that this is the correct way to understand the genome. The point is that science is not a march toward truth. Rather, as the author John McPhee wrote in 1967, “science erases what was previously true.” Every generation of scientists mulches under yesterday’s facts to fertilize those of tomorrow.

“There is grandeur in this view of life,” insisted Darwin, despite its allowing no purpose, no goal, no chance of perfection. There is grandeur in a Darwinian view of science, too. The gene is not a Platonic ideal. It is a human idea, ever changing and always rooted in time and place. To echo Darwin himself, while this planet has gone cycling on according to the laws laid down by Copernicus, Kepler, and Newton, endless interpretations of heredity have been, and are being, evolved.