For some people, genes are key to predicting our future health. For others, genes as crystal balls are overhyped. Let's call it a truce because both sides are right.
Recently, two headlines announced important news about the role of genes in assessing and predicting future health. These came from the New York Times, but could have come from any number of other media outlets that carried the stories:
The casual reader might be excused if they took away the idea that scientists had on the one hand discovered a rogue gene responsible for autism, and on the other had found that DNA isn't very helpful in predicting disease. These readers might also be forgiven if they wondered how these two findings could each be true.
In fact, both headlines are correct - though not entirely. This is because geneticists over the years have identified a slew of gene markers linked to disease, some of which have turned out, upon further research, to be useful for predicting and understanding disease. Others have not, because most of what happens to the majority of us genetically is far more complex and nuanced.
Yet public discussion of genetics tends toward oversimplification and extremes. Some scientists, entrepreneurs, and journalists portray genes and gene markers as near-magical fortune-tellers of a person's health future. Others claim that the first group has overhyped genetics and underplayed the role of the environment and other factors that also influence disease.
The "mighty gene" storyline has its roots in the late 1980s and 1990s effort to sequence the human genome. Boosters in science and industry elevated genes to superstar status in part because they genuinely believed that DNA was the key driver in even common diseases. This also helped to sell a Human Genome Project that required billions of dollars from the U.S. Congress - and billions more from investors to sequence DNA and to hopefully turn this knowledge into drugs and other treatments.
Fifteen years ago the Genes-'R-Everything fervor was so convincing that it attracted the tech-dystopia police - those thinkers and artists who are always looking for worst-case scenarios of technology run amok. In this case, the prospect of a world where genes truly were paramount led to movies like Gattaca, which in 1997 depicted a society where one's DNA determined everything from lovers to jobs - and followed one man's effort to overcome his genetic deficits.
Thankfully, Gattaca's assumptions about the deterministic power of genes were wrong, although the director and screenwriter Andrew Niccol can be pardoned if he believed the hyperventilated talk at the time about the potential of genes not only to diagnose and treat disease, but to predict a person's bio-future.
The appearance of Gattaca and other deterministic discourse brought forth cries of "DNA hype" by the time the human genome was fully sequenced in 2003. This crescendoed in late 2007 when the first direct-to-consumer genetic testing companies, 23andMe and deCODEme, appeared with products that claimed to offer customers predictive risk factors for future disease, along with probabilities for having other traits like curly hair.
Many geneticists - some of them the same ones who lauded the future of genetics in the 1990s - decried the commercialization of genetics as promising too much.
Yet like many myths, the debate over what genes really do has roots reaching back well before the nineties - to 1953 and the discovery by James Watson and Francis Crick (with a major assist from Rosalind Franklin) that DNA is a double helix.
Crick added to the cult of the super-gene in the 1960s with his notion of a "Central Dogma" of molecular biology - the idea that genetic information flows in one direction, from a gene to the protein it encodes (proteins are what genes are coded to make in a cell). This was popularly flattened into a simpler equation: one gene equals one protein equals one disease or trait - the point being that genes were the key.