I found it very odd to see Paul Krugman complaining that "patients are not consumers" as if "consumer" were some sort of horrible, low-status role that should never taint the sacred realm of health care. In my economics classes, "consumer" was not a value judgement; it was a descriptor. A consumer is someone who consumes, just as a producer is someone who produces and a distributor is someone who distributes. So I was a bit befuddled to see an economist arguing that "The idea that all this can be reduced to money -- that doctors are just 'providers' selling services to health care 'consumers' -- is, well, sickening. And the prevalence of this kind of language is a sign that something has gone very wrong not just with this discussion, but with our society's values." Patients consume health care resources. Providers provide them. And the system through which labor and resources are allocated in our society remains money--an arrangement that I'm pretty sure Paul Krugman doesn't want to change.
This semantic moralizing takes away from what I do think is the core argument between the partisans of the "People's Budget" and the advocates of Ryan's Medicare voucher plan: whether consumers--er, patients--or a central committee (IPAB) should be in charge of deciding what to do with limited health care resources. Paul Krugman, unsurprisingly, is against putting consumers in control:
"Consumer-based" medicine has been a bust everywhere it has been tried. Medicare Advantage was supposed to save money; it ended up costing substantially more than traditional Medicare. America has the most "consumer-driven" health care system in the advanced world. It also has by far the highest costs yet provides a quality of care no better than far cheaper systems in other countries.
But the fact that Republicans are demanding that we stake our health on a failed approach is only part of what's wrong. As I said earlier, there's something wrong with the whole notion of patients as "consumers" and health care as simply a financial transaction.
Medical care, after all, is an area in which crucial decisions must be made. Yet making such decisions intelligently requires a vast amount of specialized knowledge.
Furthermore, those decisions often must be made under conditions in which the patient is incapacitated, under severe stress or needs action immediately, with no time for discussion, let alone comparison shopping.
The statistics with which he opens are dubious: Medicare Advantage is more expensive because it provides more benefits, and the US isn't even close to being the leader in consumer-driven medicine, if by that you mean cost-sharing and purchasing decisions; in the rich world, that would almost certainly be Switzerland, where consumers--er, patients--not only pay heavily out of pocket, but purchase their own insurance, as both Kaiser and Cato will tell you.
But though Krugman may be wrong about how consumer-driven our system is, he's not wrong that this is a core conflict. Nor do I think he's wrong that patients will frequently decide wrong. Where Krugman and I differ is that I don't think that centralized rule making is going to do such a super job either, for two reasons.
The first is that providers and patients are going to fight cuts with every fiber of their being, and they will find it easier to fight on individual procedures than on increasing the size of the health care voucher; the former is not very expensive for any given procedure, while the latter is a large, obvious whack in the pocketbook for taxpayers. Think of how easy it has been for oxygen providers to keep their Medicare reimbursements--and how hard it was to pass a new health care entitlement.
But the second is that while consumers may be stupid, rules are often stupid too. Evidence-based medicine is certainly a good idea, but we are nowhere near being able to generate solid rules that a) cover all major possibilities and b) provide the highest chance of survival for the money. People are incredibly complicated. This makes outcomes hard to measure--and solid guidelines hard to develop. Drugs are the most intensively tested health care treatments we have, with the sort of rigorously controlled, double-blind studies that you need to get significant results. But we don't do nearly as much testing as we should: too little head-to-head testing of various products, and far too little testing that could distinguish sub-populations which benefit most from a given drug. It's common to blame pharmaceutical companies' financial incentives, and that's part of it, which is why I support having the government do more head-to-head testing. But that's far from the only limitation. The biggest limitation is often finding enough patients with a given disease to produce statistically significant results. The more satisfied patients are with their current treatments, the harder it is to test whether those treatments are effective.
But even if we had the kind of data we'd need to develop a comprehensive set of rules, the problem remains: rules are stupid. You need to leave room for individual discretion. And individual discretion on the part of doctors and hospitals is a loophole you could drive a truck through.
Nor do I think the possibility of reducing costs through individual discretion is quite as impossible as Krugman makes things sound. Sure, a lot of decisions are life-or-death, last-minute things. But a lot of them aren't. They're questions like, "Do we send grandma to a nursing home, or try to keep her in the spare bedroom with the help of a home health-care aide?" Or "I've got stage four breast cancer with bone metastases; should I really mortgage the house to try another round of chemo?"
It's all very well to say that people shouldn't have to make those decisions on the basis of money. But that's exactly what the government is going to do. Sure, there are some procedures that people just shouldn't have (like a lot of back surgery). But a lot of this is value judgements: hip replacements for elderly patients, expensive chemotherapy that may extend life by a few months, more convenient dosing schedules or better side-effect profiles for brand name drugs. Unless we simply rely on across-the-board reimbursement cuts--which would be moronic on every level--the government is mostly not going to be deciding which treatments are effective; it's going to be deciding which treatments are cost-effective. We haven't taken doctors out of the business of selling health care to patients; we've just added a middleman.
Now, maybe you think that the government is smarter than the consumers it's speaking for. But how does the government know what you value most: an extra three months of life when you have cancer, or an extra five years of walking after age 89, or an extra $4,000 right now?
I think that people who favor a central board probably put more faith in technocrats than I do, but also, that they are horrified by the specificity of the choices. They're comfortable making decisions about who lives or who dies when the people in those decisions are just decimal points in an aggregate statistic. But they find it horrifying that anyone--particularly the patient--should have to make that decision about a specific person.
But to me, they're not really that different. All those decimal points are people too. And it's just as heart-rending when they suffer or die.
A child psychologist argues punishment is a waste of time when trying to eliminate problem behavior. Try this instead.
Say you have a problem child. If it’s a toddler, maybe he smacks his siblings. Or she refuses to put on her shoes as the clock ticks down to your morning meeting at work. If it’s a teenager, maybe he peppers you with obscenities during your all-too-frequent arguments. The answer is to punish them, right?
Not so, says Alan Kazdin, director of the Yale Parenting Center. Punishment might make you feel better, but it won’t change the kid’s behavior. Instead, he advocates for a radical technique in which parents positively reinforce the behavior they do want to see until the negative behavior eventually goes away.
As I was reporting my recent series about child abuse, I came to realize that parents fall roughly into three categories. There’s a small number who seem intuitively to do everything perfectly: Moms and dads with chore charts that actually work and snack-sized bags of organic baby carrots at the ready. There’s an even smaller number who are horrifically abusive to their kids. But the biggest chunk by far are parents in the middle. They’re far from abusive, but they aren’t super-parents, either. They’re busy and stressed, so they’re too lenient one day and too harsh the next. They have outdated or no knowledge of child psychology, and they’re scrambling to figure it all out.
In 12 of 16 past cases in which a rising power has confronted a ruling power, the result has been bloodshed.
When Barack Obama meets this week with Xi Jinping during the Chinese president’s first state visit to America, one item probably won’t be on their agenda: the possibility that the United States and China could find themselves at war in the next decade. In policy circles, this appears as unlikely as it would be unwise.
And yet 100 years on, World War I offers a sobering reminder of man’s capacity for folly. When we say that war is “inconceivable,” is this a statement about what is possible in the world—or only about what our limited minds can conceive? In 1914, few could imagine slaughter on a scale that demanded a new category: world war. When war ended four years later, Europe lay in ruins: the kaiser gone, the Austro-Hungarian Empire dissolved, the Russian tsar overthrown by the Bolsheviks, France bled for a generation, and England shorn of its youth and treasure. A millennium in which Europe had been the political center of the world came to a crashing halt.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it wasn’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
Neuroscientist James Fallon discovered through his work that he has the brain of a psychopath, and subsequently learned a lot about the role of genes in personality and how his brain affects his life.
In 2005, James Fallon's life started to resemble the plot of a well-honed joke or big-screen thriller: A neuroscientist is working in his laboratory one day when he thinks he has stumbled upon a big mistake. He is researching Alzheimer's and using his healthy family members' brain scans as a control, while simultaneously reviewing the fMRIs of murderous psychopaths for a side project. It appears, though, that one of the killers' scans has been shuffled into the wrong batch.
The scans are anonymously labeled, so the researcher has a technician break the code to identify the individual in his family, and place his or her scan in its proper place. When he sees the results, however, Fallon immediately orders the technician to double check the code. But no mistake has been made: The brain scan that mirrors those of the psychopaths is his own.
Neither the Islamic State nor al-Qaeda would be where they are today without Abu Abdullah al-Muhajir, who was recently killed in an American airstrike.
Last year, the Islamic State released a training video, one of a multipart series shot in Iraq. With its scenes of foot drills, target practice, and karate chops, it would have been entirely unremarkable were it not for a short classroom scene, in which an instructor walks viewers through the ideological curriculum forced upon new recruits to the ISIS cause. As he’s shown reeling off a list of some key topics in jihadist jurisprudence, one can glimpse a thick volume resting atop each of the 20 or so schoolroom desks—a manuscript that, while few would recognize it outside of jihadist circles, is instrumental to ISIS as a theological playbook that is used to justify the group’s most abhorrent acts.
President-elect Donald Trump has committed a sharp breach of protocol—one that underscores just how weird some important protocols are.
It’s hardly remembered now, having been overshadowed a few months later on September 11, but the George W. Bush administration’s first foreign-policy crisis came in the South China Sea. On April 1, 2001, a U.S. Navy surveillance plane collided with a Chinese jet near Hainan Island. The pilot of the Chinese jet was killed, and the American plane was forced to land and its crew was held hostage for 11 days, until a diplomatic agreement was worked out. Sino-American relations remained tense for some time.
Unlike Bush, Donald Trump didn’t need to wait to be inaugurated to set off a crisis in the relationship. He managed that on Friday, with a phone call with the president of Taiwan, Tsai Ing-wen. It’s a sharp breach with protocol, but it’s also just the sort that underscores how weird and incomprehensible some important protocols are.
Life in Ohio's proud but economically abandoned small towns
Just over a decade ago, Matt Eich started photographing rural Ohio. Largely inhabited by what is now known as the “Forgotten Class” of white, blue-collar workers, Eich found himself drawn to the proud but economically abandoned small towns of Appalachia. Thanks to grants from the Economic Hardship Reporting Project and Getty Images, Eich was able to capture the family life, drug abuse, poverty, and listlessness of these communities. “Long before Trump was a player on the political scene, long before he was a Republican, these people existed and these problems existed,” Eich said. His new book, Carry Me Ohio, published by Sturm and Drang, is a collection of these images and the first of four books he plans to publish as part of The Invisible Yoke, a photographic meditation on the American condition. Even with a deep knowledge of the region, Eich was unprepared for the fury and energy that surrounded the election this year. “The anger is overpowering,” he said. “I knew what was going on, and I’m still surprised. I should have listened to the pictures.”
Trump's election has reopened questions that have long seemed settled in America—including the acceptability of open discrimination against minority groups.
When Stephen Bannon called his website, Breitbart, the “platform for the alt-right” this summer, he was referring to a movement that promotes white nationalism and argues that the strength of the United States is tied to its ethnic European roots. Its members mostly stick to trolling online, but much of what they do isn’t original or new: Their taunts often involve vicious anti-Semitism. They make it clear that Jews are not included in their vision of a perfect, white, ethno-state.
On the opposite side of American politics, many progressive groups are preparing to mount a rebellion against Donald Trump. They see solidarity among racial minorities as their goal, and largely blame Trump’s election on racism and white supremacy. Three-quarters of American Jews voted against Trump, and many support this progressive vision. Some members of these groups, though, have singled out particular Jews for their collusion with oppressive power—criticisms which range from inflammatory condemnations of Israel to full-on conspiracies about global Jewish media and banking cabals.
A few weeks ago, I was trying to call Cuba. I got an error message—which, okay, international telephone codes are long and my fingers are clumsy—but the phone oddly started dialing again before I could hang up. A voice answered. It had a British accent and it was reading: “...the moon was shining brightly. The Martians had taken away the excavating-machine…”
Apparently, I had somehow called into an audiobook of The War of the Worlds. Suspicious of my clumsy fingers, I double-checked the number. It was correct (weird), but I tried the number again, figuring that at worst, I’d learn what happened after the Martians took away the excavating machine. This time, I got the initial error message and the call disconnected. No Martians.
The Republican presidential candidate is not a Fascist, but his campaign bears notable similarities to the reign of Italian dictator Benito Mussolini.
Fascism has been back in the news with Donald Trump’s candidacy for the American presidency. His populist claim to speak for the white everyman, along with his menacing leadership style, has brought forth comparisons between this “homegrown authoritarian,” as President Barack Obama has called Trump, and foreign strongmen.
Trump is not a Fascist. He does not aim to establish a one-party state. Yet he has created a one-man-led political movement that does not map onto traditional U.S. party structures or behave in traditional ways. This is how Fascism began as well.
A century before Trump, Benito Mussolini burst onto the Italian political scene, confounding the country’s political establishment with his unorthodox doctrine and tactics and his outsized personality. Mussolini’s rise offers lessons for understanding the Trump phenomenon—and why he was able to disarm much of the American political class.