If you'll excuse me, the rest of us will be over here in the corner, freaking out a little bit.
It's tempting, of course, to blame this on Obamacare, and I certainly wouldn't rule out the possibility that this is at least part of the explanation. But there's no evidence that this is the case, other than the crude time correlation. Derek Lowe suggests that the answer is simpler: the return to R&D spending in the industry has been falling for a long time, as many therapeutic areas are crowded with generics (or soon-to-be generics) that already do a very good job. The remaining areas (cancer, central nervous system, obesity) turn out to be very tough, and there's no guarantee that we'll ever find pharmaceutical interventions that do what we want.
There's an interesting discussion to be had about whether the market outcome differs from the socially optimal outcome--whether falling returns to pharmaceutical R&D mean that we should be putting more resources into it, or fewer. But I'll leave that aside for the nonce, because I'm not sure what I think, and talk about what this means for the rest of the health care system.
You might initially think that this is good news for cost control--the expensive brand name drugs will all go generic, and we'll save a bunch of money on prescription drugs. And indeed, this is absolutely true. But this will have repercussions for other areas of health care, and those repercussions are not good. While some drugs are simply an added expense (think chemotherapy prolonging the lives of people who would otherwise have died sooner), many of the real blockbusters substitute for labor-intensive treatment. Statins instead of cardiac catheterizations or coronary bypasses. Avandia instead of amputations. Hydrochlorothiazide instead of nursing home care for your massive stroke.
We'll still have all those drugs, of course. But with less R&D, we'll presumably see fewer pharmaceutical substitutes for the expensive conditions we still spend a lot of money treating, like Alzheimer's. Which means that health care expenses might actually rise faster than we expect. Most of the rest of the health care system is subject to a phenomenon known as Baumol's Cost Disease, which I described thusly a few years ago:
. . . medical productivity doesn't improve as fast as most of the rest of the economy--basically, activities that are very labor intensive don't tend to have massive productivity gains. That's why it still takes just about as many teachers to teach 50,000 sixth graders as it did fifty years ago. Similarly, it still takes one person to give you a sponge bath and administer your pills.
There's a caveat, however: antibiotic resistance. Without antibiotics, a lot of our heroic interventions become a lot more deadly, which means they're less likely to be performed. And someone who has died from an infection following their coronary bypass will not be around to demand an expensive knee replacement.
Unfortunately, that's not exactly a comforting thought.
A new book by the evolutionary biologist Jerry Coyne tackles arguments that science and religion are compatible.
In May 1988, a 13-year-old girl named Ashley King was admitted to Phoenix Children’s Hospital by court order. She had a tumor on her leg—an osteogenic sarcoma—that, writes Jerry Coyne in his book Faith Versus Fact, was “larger than a basketball,” and was causing her leg to decay while her body started to shut down. Ashley’s Christian Scientist parents, however, refused to allow doctors permission to amputate, and instead moved their daughter to a Christian Science sanatorium, where, in accordance with the tenets of their faith, “there was no medical care, not even pain medication.” Ashley’s mother and father arranged a collective pray-in to help her recover—to no avail. Three weeks later, she died.
In 1992, the neuroscientist Richard Davidson got a challenge from the Dalai Lama. By that point, he’d spent his career asking why people respond to, in his words, “life’s slings and arrows” in different ways. Why are some people more resilient than others in the face of tragedy? And is resilience something you can gain through practice?
The Dalai Lama had a different question for Davidson when he visited the Tibetan Buddhist spiritual leader at his residence in Dharamsala, India. “He said: ‘You’ve been using the tools of modern neuroscience to study depression, and anxiety, and fear. Why can’t you use those same tools to study kindness and compassion?’ … I did not have a very good answer. I said it was hard.”
Defining common cultural literacy for an increasingly diverse nation.
Is the culture war over?
That seems an absurd question. This is an age when Confederate monuments still stand; when white-privilege denialism is surging on social media; when legislators and educators in Arizona and Texas propose banning ethnic studies in public schools and assign textbooks euphemizing the slave trade; when fear of Hispanic and Asian immigrants remains strong enough to prevent immigration reform in Congress; when the simple assertion that #BlackLivesMatter cannot be accepted by all but is instead contested petulantly by many non-blacks as divisive, even discriminatory.
And that’s looking only at race. Add gender, guns, gays, and God to the mix and the culture war seems to be raging along quite nicely.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
The Fourth of July—a time we Americans set aside to celebrate our independence and mark the war we waged to achieve it, along with the battles that followed. There was the War of 1812, the War of 1833, the First Ohio-Virginia War, the Three States' War, the First Black Insurrection, the Great War, the Second Black Insurrection, the Atlantic War, the Florida Intervention.
Confused? These are actually conflicts invented for the novel The Disunited States of America by Harry Turtledove, a prolific (and sometimes-pseudonymous) author of alternate histories with a Ph.D. in Byzantine history. The book is set in the 2090s in an alternate United States that is far from united. In fact, the states, having failed to ratify a constitution following the American Revolution, are separate countries that oscillate between cooperating and warring with one another, as in Europe.
The executive producer of Masterpiece says Jane Austen works a lot better on screen than Hemingway does.
For 44 years, PBS’s Masterpiece franchise has brought high-end British TV programs to American audiences. While the ultra-successful Downton Abbey comes from an original screenplay, many of Masterpiece’s shows over the years have been adapted from great works of literature. And the vast majority of those great works of literature, unsurprisingly, have been British.
But every so often, an American novel—like James Agee’s A Death in the Family or Willa Cather’s The Song of the Lark—has been turned into a Masterpiece. On Friday at the Aspen Ideas Festival, Rebecca Eaton, the longtime executive producer of Masterpiece, said she wished that the program had tackled more U.S. authors over the years. “The reasons that we haven't are twofold,” she said. “One is money, the second is money. And the third is money. Also, the dark nature of American literature, which is something to think about for a moment.”
How a re-creation of its most famous battle helped erase the meaning of the Civil War.
"No person should die without seeing this cyclorama," declared a Boston man in 1885. "It's a duty they owe to their country." Paul Philippoteaux's lifelike depiction of the Battle of Gettysburg was much more than a painting. It re-created the battlefield with such painstaking fidelity, and created an illusion so enveloping, that many visitors felt as if they were actually there.
For all its verisimilitude, though, the painting failed to capture the deeper truths of the Civil War. It showed the two armies in lavish detail, but not the clash of ideals that impelled them onto the battlefield. Its stunning rendition of a battle utterly divorced from context appealed to a nation as eager to remember the valor of those who fought as it was to forget the purpose of their fight. Its version of the conflict proved so alluring, in fact, that it changed the way America remembered the Civil War.
Former Senator Jim Webb is the fifth Democrat to enter the race—and by far the most conservative one.
In a different era’s Democratic Party, Jim Webb might be a serious contender for the presidential nomination. He’s a war hero and former Navy secretary, but he has been an outspoken opponent of recent military interventions. He’s a former senator from Virginia, a purple state. He has a strong populist streak, could appeal to working-class white voters, and might even have crossover appeal from his days as a member of the Reagan administration.
In today’s leftward-drifting Democratic Party, however, it’s hard to see Webb—who declared his candidacy Thursday—getting very far. As surprising as Bernie Sanders’s rise in the polls has been, he looks more like the Democratic base than Webb does. The Virginian is progressive on a few major issues, including the military and campaign spending, but he’s far to the center or even right on others: He's against affirmative action, supports gun rights, and is a defender of coal. During the George W. Bush administration, Democrats loved to have him as a foil to the White House. It’s hard to imagine the national electorate will cotton to him in the same way. Webb’s statement essentially saying he had no problem with the Confederate battle flag flying in places like the grounds of the South Carolina capitol may have been the final straw. (At 69, he’s also older than Hillary Clinton, whose age has been a topic of debate, though still younger than Bernie Sanders or Joe Biden.)
The meaning of the Confederate flag is best discerned in the words of those who bore it.
This afternoon, in announcing her support for removing the Confederate flag from the capitol grounds, South Carolina Governor Nikki Haley asserted that killer Dylann Roof had “a sick and twisted view of the flag” which did not reflect “the people in our state who respect and in many ways revere it.” If the governor meant that very few of the flag’s supporters believe in mass murder, she is surely right. But on the question of whose view of the Confederate flag is more twisted, she is almost certainly wrong.
Roof’s belief that black life had no purpose beyond subjugation is “sick and twisted” in the exact same manner as the beliefs of those who created the Confederate flag were “sick and twisted.” The Confederate flag is directly tied to the Confederate cause, and the Confederate cause was white supremacy. This claim is not the result of revisionism. It does not require reading between the lines. It is the plain meaning of the words of those who bore the Confederate flag across history. These words must never be forgotten. Over the next few months the word “heritage” will be repeatedly invoked. It would be derelict to not examine the exact contents of that heritage.
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one that its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.