If the prevalence and commonality of death has had any positive side effect on Louisiana—which has one of the lowest life expectancies in the U.S.—it’s that residents have attuned themselves to its context. “Early on, I got some sense of history and how ages compare, and how one of the responsibilities we face in this age is to be conscious of what’s unique to it,” says author Anne Rice, one of New Orleans’s most famous daughters. “If you’re aware that in 1850 people starved to death in the middle of New Orleans or New York, that’s a dramatic difference between past and future.”
Rice’s classic novels—Interview with the Vampire, The Vampire Lestat, Queen of the Damned, and many more—predate the current vampire craze. Her oeuvre still stands above most of the genre, however, because it represents a unique approach not replicated even decades after many of the books first appeared: New Orleans framed Rice’s perspective as she grew up there. Modern metropolises have transformed their environs into finely tuned systems of order, but the Crescent City teems with a charmingly antiquated natural chaos. The city offers a living, breathing reminder of the past—and, therefore, of how far humanity has come.
“The failure of most vampire literature is that the authors can’t successfully imagine what it’s like to be 300 years old. I try really hard to get it right,” Rice says. “I really love taking Lestat”—her most famous character—“into an all-night drugstore and having him talk about how he remembers in 1789 that not a single product there existed in any form that was available to him as a young man in Paris. He marvels at the affluence and the wealth of the modern world.”
To a caveman, modern humans might appear not unlike Lestat and his vampire kin. We don’t necessarily consume blood to live, nor can we transform into bats, wolves, or mist, but we do have a host of seemingly superhuman powers. Chief among those, to the primitive human, would be our ability to live long lives.
If a caveman were exceptionally lucky, he might have made it to his 40s, but he more than likely would have succumbed to pneumonia, starvation, or injury before his early 20s—if he survived infancy in the first place, that is. Life expectancy for humans more than 10,000 years ago was short and didn’t improve much for a long time. In ancient Rome, the average citizen lived to only about age 24. But most counted themselves fortunate to get even that far; more than a third of children died before their first birthday. A thousand years later, expectations looked much the same.
Over the course of the next 800 years, people in the more advanced parts of the world added only 15 years to their life expectancy. An average American in 1820 could expect to see 39. Lifespans started to pick up in the early 19th century—around the same time that vampire myths were proliferating in Europe—and really sped up in the 20th thanks to a decline in infant mortality and improvements to health in general. By 2010, the average U.S. life expectancy had nearly doubled from two centuries prior, at 78 years, with similar results in other developed countries. To a caveman, or an average Roman, that would seem like an eternity.
Rice recognizes this perspective. Even with Louisiana’s comparatively low life expectancy, she and others from the Pelican State are still far better off than most people at any point in history. “I would be dead if we were in the 19th century,” says the septuagenarian. “But we’re living in the most wonderful age. Never before has the world been the way it is for us. There’s never been this kind of longevity and good health.”
One question that inevitably arises when talking about living longer is, are we living better? A person might live to 100 today, but what’s the quality of those later years?
The question can’t be answered empirically unless we consider what used to make us sick and kill us. The top three killers of Americans in 1900—pneumonia or influenza, tuberculosis, and gastrointestinal infections—don’t appear on the 2010 list, banished to manageability along with historic illnesses such as smallpox, scurvy, and rubella. Today’s top three—heart disease, cancer, and noninfectious airways disease—stand apart. Unlike their predecessors, they’re not infectious; instead they’re environmental, self-inflicted, or genetic. Some doctors believe that makes them far more treatable. Others think we’re entering a technology-driven healthcare revolution that will not only beat back some of the worst killers but also greatly improve the quality of life after illness.
Health gadgets and apps are proliferating quickly and comprehensively. With inexpensive heart-rate-monitoring and step-counting wristbands such as the Nike Fuelband and the Fitbit becoming hits with consumers in recent years, inventors and entrepreneurs are flooding the market with all manner of self-tracking tools. Sensoria’s Fitness Socks gauge how well you walk—your individual footfalls and gait—then give you an overall sense of your foot health. The HapiFork tracks how quickly you eat and chew, and buzzes if you’re going too fast. There’s hardly a human activity that can’t be tracked, measured, and corrected.
The proliferation of self-tracking means doctors and individuals are assembling an increasing wealth of data, which inevitably will make healthcare more personalized. Like a fingerprint, each person’s biological makeup is unique, complete with its own nuances and combinations of health conditions. That’s why so many mass-market drugs either don’t work or come with a terrifyingly long list of possible side effects. With better data and the spread of individualized health information, pharmaceutical companies increasingly can specialize and improve drugs and treatments for smaller groups of people, as they have with certain types of cancer and cystic fibrosis.
On the diagnosis side, supercomputer assistants—some derived from the likes of IBM’s Jeopardy champion, Watson—are aiding doctors in crunching all that data to generate better assessments. Put all those pieces together, and those longer lives that people are experiencing don’t have to teem with pain and misery. “People don’t want to make it to a certain year, they want to make it to a certain quality of life,” says cardiologist Eric Topol. “Decreasing the burden of chronic diseases, that’s where it’s at. This will transcend the old dinosaur era of medicine.”
Whether we’re living better is one of the most subjective questions we can ask ourselves since so many factors come into play. Age and era are the biggest. If you had asked a 30-year-old in the 18th century to rate her quality of life, her answer would have differed widely from a similarly aged person living in the developed world right now. Yet a 90-year-old today might feel the same as that 30-year-old three centuries prior. Neither person would have the necessary context to consider the other’s life.
“We’re seeing death in a new way,” says Rice. “Instead of taking it for granted, the people I know see it as a personal catastrophe. I get emails from people who are actually surprised that someone has died. They regard it as an injustice. I understand their feelings, I get it, but this is a fairly new perspective on death. Nobody in the 1800s would have regarded death as a personal catastrophe. They would have mourned and might have been grief-stricken, but they saw death all around them.”
In that sense, death as an event is growing in importance, which in turn means that the value of human life is also rising. In economics, a commodity is more valuable the rarer it is, which is why a finite resource such as oil can fetch top dollar. Human life, measured as time on Earth, works the same way—but it also doesn’t. If we can expect to live a long time, we may not treasure individual years as much. We might even waste time by indulging in extraneous pursuits, such as sailing around the world or mastering the ukulele. On the other hand, if we live for many years, the value we have to other people, such as friends and family members, tends to increase.
Life differs from most commodities in the value it has for the person possessing it. If an individual has some oil but doesn’t like it, he or she can sell it for a nice profit because other people or entities do value it. An individual’s life, however, doesn’t have the same transferable value. Relatives and loved ones may treasure your life, but ultimately it isn’t worth much if you don’t value it yourself—which is where quality comes in.
At the beginning of Interview with the Vampire, Lestat is a confident and happy undead monster in late 18th-century New Orleans. He creates a vampire family of sorts by siring an aristocrat named Louis and then a young girl named Claudia. They live together happily for a while, but eventually his protégés turn on him and flee. Near the end of the book, in modern times, Lestat is living in squalor, barely alive. Despite his immortality and the automatic fulfillment of his biological baseline—as long as he drinks blood—he’s miserable after years of being alone with the memory of his family’s rejection. The message is clear: It’s not enough to simply exist.
Most vampire fiction falls firmly within the realm of horror, but Rice’s books read more like psychological case studies; the vampire characters comment on the human effects of technology and progress. With technology extending our lives and improving our health, the analogy fits better today than ever before: Humans may not be vicious psychopaths who drink the blood of innocents, but in our ever-lengthening lives we are becoming more akin to vampires. Age affects Rice’s characters in different ways: Some become wise and contented while others grow vain and egotistical. Which path are we treading as we inch toward immortality?
This article is excerpted from Peter Nowak’s Humans 3.0: The Upgrading of the Species.