Do Genes Really Augur Your Future?


For some people, genes are key to predicting our future health. For others, genes as crystal balls are overhyped. Let's call it a truce because both sides are right.


A visualization of the human genome (AAAS/Science).

Recently, two headlines announced important news about the role of genes in assessing and predicting future health. Both came from the New York Times, but they could have come from any number of other media outlets that carried the stories:

Scientists Link Gene Mutation to Autism Risk
Study Says DNA's Power to Predict Illness Is Limited

Casual readers might be excused if they took away the idea that scientists had, on the one hand, discovered a rogue gene responsible for autism and, on the other, found that DNA isn't very helpful in predicting disease. These readers might also be forgiven if they wondered how both findings could be true.

In fact, both headlines are correct - though not entirely. Over the years, geneticists have identified a slew of gene markers linked to disease. Some have turned out, upon further research, to be useful for predicting and understanding disease. Others have not, because what happens to most of us genetically is far more complex and nuanced.

Yet the public discussion of genetics is often one of oversimplification and extremes. Some scientists, entrepreneurs, and journalists portray genes and gene markers as near-magical fortune-tellers of a person's health future. Others claim that the first group has overhyped genetics and underplayed the role of the environment and other factors that also shape disease.

The "mighty gene" storyline has its roots in the late 1980s and 1990s effort to sequence the human genome. Boosters in science and industry elevated genes to superstar status in part because they genuinely believed that DNA was the key driver in even common diseases. This also helped to sell a Human Genome Project that required billions of dollars from the U.S. Congress - and billions more from investors to sequence DNA and to hopefully turn this knowledge into drugs and other treatments.

Fifteen years ago the Genes-'R-Everything fervor was so convincing that it attracted the tech-dystopia police - those thinkers and artists who are always looking for worst-case scenarios of technology run amok. In this case, the prospect of a world where genes truly were paramount led to movies like Gattaca, which in 1997 depicted a society in which one's DNA determined everything from lovers to jobs - and followed one man's effort to overcome his genetic deficits.

Thankfully, Gattaca's assumptions about the deterministic power of genes were wrong, although director and screenwriter Andrew Niccol can be pardoned if he believed the hyperventilated talk at the time about the power of genes not only to diagnose and treat disease, but to predict a person's bio-future.

The appearance of Gattaca and other deterministic discourse brought forth cries of "DNA hype" by the time the human genome was fully sequenced in 2003. The criticism crescendoed in late 2007, when the first direct-to-consumer genetic testing companies, 23andme and deCodeme, appeared with products that claimed to offer customers predictive risk factors for future disease, along with probabilities for other traits like curly hair.

Many geneticists - some of them the same ones who lauded the future of genetics in the 1990s - decried the commercialization of genetics as promising too much.

Yet like many myths, the debate over what genes really do dates back even further than the nineties - to 1953 and the discovery by James Watson and Francis Crick (with a major assist from Rosalind Franklin) that DNA is a double helix.

Crick added to the cult of the super-gene in the 1960s with his notion of a "Central Dogma" in genomics - the idea that one gene equaled the production of one protein (proteins are what genes code for a cell to make), which equaled one disease or trait - the point being that genes were the key.

Few people know, however, that Crick's "Central Dogma" was a joke. A man who loved to poke fun, Crick was also a vociferous atheist who disliked dogmas of all kinds - including those in science. He created the Central Dogma as a humorous reaction to people who believed that genes were everything - something most scientists even then knew was an oversimplification. (For more on this history, check out my book, Masterminds: Genius, DNA, and the Quest to Rewrite Life.)

And yet, to add to the complexity inherent in genetics, the central dogma in some cases is true. For instance, there are rare genetic glitches - single-gene mutations such as the one behind Tay-Sachs, or chromosomal abnormalities such as the extra chromosome behind Down syndrome - that are directly responsible for disease. For these terrible conditions, the genetics alone are highly predictive.

For most common diseases, however, this has not turned out to be the case. Single genetic mutations, as augurs of risk for, say, diabetes and many cancers, are at best only slightly more informative than knowing one's average risk for these maladies.

This is the point of the study published last week in Science Translational Medicine, which looked at the predictive power of genes for 24 common diseases. Researchers at Johns Hopkins studied the genetics and outcomes of more than 53,000 identical twins - people born with identical DNA. They found that for 20 of the diseases, genes had little or no added predictive power.

This is what the second headline above reports, and it seems like a victory for the "genes are overhyped" camp. Yet the news here, too, is more nuanced. It turns out that for the other four diseases analyzed by the Hopkins team - Alzheimer's disease, autoimmune thyroid disease, Type 1 diabetes, and heart disease in men - genetic tests can identify up to 75 percent of those who will get them.

Nor does the first headline above, about new gene mutations linked to autism, tell the whole story of that discovery. As the Times story beneath the headline explains, the newly identified mutations - found by three different teams, at Yale, Harvard, and the University of Washington in Seattle, and reported in Nature - are extremely rare, affecting only a handful of patients. Nor do they seem to have much relevance to diagnosing, treating, or predicting autism, though the researchers believe the discoveries could be important for better understanding the mechanisms of the disease.

While I was writing my recent book, Experimental Man, and after its publication, scientists identified over 23,000 personal genetic risk factors for me - everything from a low risk of having brown eyes (true: my eyes are blue) to a high risk for Parkinson's disease (false: at age 54 I thankfully do not have any sign of this condition). Other risk factors suggest a high probability that I will suffer side effects from certain drugs, such as statins, which I will keep in mind should my cholesterol soar.

Otherwise, my vast library of possible genetic futures has not changed my life - in part because I'm not sure what to believe, given the current dialectic that casts genes as vital predictors to some and as overblown to others. I expect this to change as interpretations of personal genetic traits improve, but that remains in the future.

Nor does it make sense to emphasize the auguries of the future in our DNA when they are just part of the equation telling us what is happening, or might happen, to our bodies over time. Other factors include the impact of our environment - what we eat and the chemicals we are exposed to - and also what is going on with certain proteins in our bodies.

The sooner we normalize the storyline about DNA, the faster genetics will take its rightful place in our science and in our imaginations as one of several remarkable and critical elements that make us who we are - and what we might become in the future.


David Ewing Duncan is a journalist in San Francisco. He is also a television, radio, and film producer, and he has written eight books. His most recent e-book is entitled When I’m 164: The Science of Radical Life Extension, and What Happens If It Succeeds.

Duncan's previous books include Experimental Man: What one man's body reveals about his future, your health, and our toxic world. He is a correspondent for Atlantic.com and the Chief Correspondent of public radio's Biotech Nation, broadcast on NPR Talk. He has been a commentator on NPR's Morning Edition, and a contributing editor for Wired, Discover and Conde Nast Portfolio. David has written for The New York Times, Fortune, National Geographic, Harper's, The Atlantic, and many other publications. He is a former special correspondent and producer for ABC Nightline, and correspondent for NOVA's ScienceNOW! He has won numerous awards including the Magazine Story of the Year from the American Association for the Advancement of Science. His articles have twice been cited in nominations for National Magazine Awards, and his work has appeared twice in The Best American Science and Nature Writing. He is the founding director of the Center of Life Science Policy at UC Berkeley, and a founder of the BioAgenda Institute. His website is www.davidewingduncan.com
