I found it very odd to see Paul Krugman complaining that "patients are not consumers" as if "consumer" were some sort of horrible, low-status role that should never taint the sacred realm of health care. In my economics classes, "consumer" was not a value judgement; it was a descriptor. A consumer is someone who consumes, just as a producer is someone who produces and a distributor is someone who distributes. So I was a bit befuddled to see an economist arguing that "The idea that all this can be reduced to money -- that doctors are just 'providers' selling services to health care 'consumers' -- is, well, sickening. And the prevalence of this kind of language is a sign that something has gone very wrong not just with this discussion, but with our society's values." Patients consume health care resources. Providers provide them. And the system through which labor and resources are allocated in our society remains money--an arrangement that I'm pretty sure Paul Krugman doesn't want to change.
This semantic moralizing distracts from what I do think is the core argument between the partisans of the "People's Budget" and the advocates of Ryan's Medicare voucher plan: whether consumers--er, patients--or a central committee (IPAB) should be in charge of deciding what to do with limited health care resources. Paul Krugman, unsurprisingly, is against putting consumers in control:
"Consumer-based" medicine has been a bust everywhere it has been tried. Medicare Advantage was supposed to save money; it ended up costing substantially more than traditional Medicare. America has the most "consumer-driven" health care system in the advanced world. It also has by far the highest costs yet provides a quality of care no better than far cheaper systems in other countries.
But the fact that Republicans are demanding that we stake our health on a failed approach is only part of what's wrong. As I said earlier, there's something wrong with the whole notion of patients as "consumers" and health care as simply a financial transaction.
Medical care, after all, is an area in which crucial decisions must be made. Yet making such decisions intelligently requires a vast amount of specialized knowledge.
Furthermore, those decisions often must be made under conditions in which the patient is incapacitated, under severe stress or needs action immediately, with no time for discussion, let alone comparison shopping.
The statistics with which he opens are dubious. Medicare Advantage is more expensive because it provides more benefits, and the US isn't even close to being the leader in consumer-driven medicine, if by that you mean cost-sharing and purchasing decisions. In the rich world, that would almost certainly be Switzerland, where consumers--er, patients--not only pay heavily out of pocket, but also purchase their own insurance, as both Kaiser and Cato will tell you.
But though Krugman may be wrong about how consumer-driven our system is, he's not wrong that this is a core conflict. Nor do I think he's wrong that patients will frequently decide wrong. Where Krugman and I differ is that I don't think that centralized rule making is going to do such a super job either, for two reasons.
The first is that providers and patients are going to fight cuts with every fiber of their being, and they will find it easier to fight on individual procedures than on increasing the size of the health care voucher; the former is not very expensive for any given procedure, while the latter is a large, obvious whack in the pocketbook for taxpayers. Think of how easy it has been for oxygen providers to keep their Medicare reimbursements--and how hard it was to pass a new health care entitlement.
But the second is that while consumers may be stupid, rules are often stupid too. Evidence-based medicine is certainly a good idea, but we are nowhere near being able to generate solid rules that a) cover all major possibilities and b) provide the highest chance of survival for the money. People are incredibly complicated. This makes outcomes hard to measure--and solid guidelines hard to develop. Drugs are the most intensively tested health care treatments we have, with the sort of rigorously controlled, double-blind studies that you need to get significant results. But we don't do nearly as much testing as we should: too little head-to-head testing of various products, and far too little testing that could distinguish sub-populations which benefit most from a given drug. It's common to blame pharmaceutical companies' financial incentives, and that's part of it, which is why I support having the government do more head-to-head testing. But that's far from the only limitation. The biggest limitation is often finding enough patients with a given disease to produce statistically significant results. The more satisfied patients are with their current treatments, the harder it is to test whether those treatments are effective.
But even if we had the kind of data we'd need to develop a comprehensive set of rules, the problem remains: rules are stupid. You need to leave room for individual discretion. And individual discretion on the part of doctors and hospitals is a loophole you could drive a truck through.
Nor do I think that reducing costs through individual discretion is quite as impossible as Krugman makes it sound. Sure, a lot of decisions are life-or-death, last-minute things. But a lot of them aren't. They're questions like, "Do we send grandma to a nursing home, or try to keep her in the spare bedroom with the help of a home health-care aide?" Or, "I've got stage-four breast cancer with bone metastases; should I really mortgage the house to try another round of chemo?"
It's all very well to say that people shouldn't have to make those decisions on the basis of money. But that's all the government is going to do. Sure, there are some procedures that people just shouldn't have (like a lot of back surgery). But a lot of this is value judgements: hip replacements for elderly patients, expensive chemotherapy that may extend life by a few months, more convenient dosing schedules or better side-effect profiles for brand name drugs. Unless we simply rely on across-the-board reimbursement cuts--which would be moronic on every level--the government is mostly not going to be deciding which treatments are effective; it's going to be deciding which treatments are cost-effective. We haven't taken doctors out of the business of selling health care to patients; we've just added a middleman.
Now, maybe you think that the government is smarter than the consumers it's speaking for. But how does the government know what you value most: an extra three months of life when you have cancer, or an extra five years of walking after age 89, or an extra $4,000 right now?
I think that people who favor a central board probably put more faith in technocrats than I do, but also, that they are horrified by the specificity of the choices. They're comfortable making decisions about who lives or who dies when the people in those decisions are just decimal points in an aggregate statistic. But they find it horrifying that anyone--particularly the patient--should have to make that decision about a specific person.
But to me, they're not really that different. All those decimal points are people too. And it's just as heart-rending when they suffer or die.
The talk-radio host claims that he never took Donald Trump seriously on immigration. He neglected to tell his immigration-obsessed listeners.
For almost a decade, I’ve been angrily documenting the way that many right-wing talk-radio hosts betray the rank-and-file conservatives who trust them for information. My late grandmother was one of those people. She deserved better than she got. With huge platforms and massive audiences, successful hosts ought to take more care than the average person to be truthful and avoid misinforming listeners. Yet they are egregiously careless on some days and willfully misleading on others.
And that matters, as we’ll come to see.
Rush Limbaugh is easily the most consequential of these hosts. He has an audience of millions. And over the years, parts of the conservative movement that ought to know better, like the Claremont Institute, have treated him like an honorable conservative intellectual rather than an intellectually dishonest entertainer. The full cost of doing so became evident this year, when a faction of populists shaped by years of talk radio, Fox News, and Breitbart.com picked Donald Trump to lead the Republican Party, a choice that makes a Hillary Clinton victory likely and is a catastrophe for movement conservatism regardless of who wins.
Which is a different way of asking: Can a bot commit libel?
Facebook set a new land-speed record for situational irony this week, as it fired the people who kept up its “Trending Topics” feature and replaced them with an algorithm on Friday, only to find the algorithm promoting completely fake news on Sunday.
Rarely in recent tech history has a downsizing decision come back to bite the company so publicly and so quickly.
Practices meant to protect marginalized communities can also ostracize those who disagree with them.
Last week, the University of Chicago’s dean of students sent a welcome letter to freshmen decrying trigger warnings and safe spaces—ways for students to be warned about and opt out of exposure to potentially challenging material. While some supported the school’s actions, arguing that these practices threaten free speech and the purpose of higher education, the note also led to widespread outrage, and understandably so. Considered in isolation, trigger warnings may seem straightforwardly good. Basic human decency means professors like myself should be aware of students’ traumatic experiences, and give them a heads up about course content—photographs of dead bodies, extended accounts of abuse, disordered eating, self-harm—that might trigger an anxiety attack and foreclose intellectual engagement. Similarly, it may seem silly to object to the creation of safe spaces on campus, where members of marginalized groups can count on meeting supportive conversation partners who empathize with their life experiences, and where they feel free to be themselves without the threat of judgment or censure.
Like a little white Lazarus with red eyes, the paralyzed mouse was walking again.
A few days earlier, the mouse had been sprawled on an operating table while two Chinese graduate students peered through a microscope and operated on its spine. With a tiny pair of scissors, they removed the top half of a fingernail-thin vertebra, exposing a gleaming patch of spinal-cord tissue. It looked like a Rothko, a clean ivory rectangle bisected by a red line. Cautiously—the mouse occasionally twitched—they snipped the red line (an artery) and tied it off. Then one student reached for a $1,000 scalpel with a diamond blade so thin that it was transparent. With a quick slice of the spinal cord, the mouse’s back legs were rendered forever useless.
Many asset-management companies fear a program that would reduce something they depend on: consumers’ confusion.
Today, half of American households have exactly zero retirement savings, not counting traditional pension plans, which are becoming ever less common, or Social Security. There are two basic reasons for this distressing state of affairs. The first is that many families don’t make enough to cover their basic living expenses. The second is that even people who could put money aside often don’t have easy access to retirement savings programs—which is particularly the case for workers whose employers don’t offer any kind of retirement plan.
To address this second problem, several states are experimenting with public programs that automatically enroll employees in a retirement plan if their employer doesn’t offer one. California’s plan (which still must be finalized after different versions passed the two houses of the state legislature last week) would automatically cover anyone who works at a company with five or more employees. By default, each participant would save 3 percent of her income, but workers would have the choice to change their contribution percentage or to opt out altogether.
The 49ers quarterback’s decision to sit during the national anthem is being framed by some as an affront to the American military.
In a recent episode of Hard Knocks, an HBO series that follows one team a year through the rigors of an NFL training camp, the Los Angeles Rams head coach Jeff Fisher called a team-wide meeting that covered the protocol for the national anthem. Fisher gravely and emphatically explained the rules to the roughly 60 assembled men. Helmets belong under the left arm, he declared, and feet on the white of the sideline. “It’s a respect thing,” he said. “It’s a self-respect thing, it’s respect for your teammates, it’s respect for this game, and it’s respect for this country.” Fisher proceeded to show the group footage of a past Rams team following the procedures and, turning to face the screen himself in the silence of the room, said, “That’s how you start a game.”
A look back at one of Gene Wilder’s most memorable roles, in a film that is as much about technology as it is about childhood
The lesson you learn right away, when you are a small child who has devoured a heap of Roald Dahl books, is that childhood is dark and dangerous—and yet still an adventure worth taking. In Dahl’s simultaneously sinister and gloriumptious worlds, to use one of his many invented adjectives, breaking the rules can yield both great rewards and terrible punishment.
Navigating this not-always-straightforward relationship between what people deserve and what they get is part of growing up. It’s also a central theme in one of Dahl’s most beloved books, Charlie and the Chocolate Factory, and an idea explored thoroughly by Willy Wonka, the quirky candy maker at the center of the story.
Gene Wilder, in his outstanding portrayal of Wonka in the 1971 adaptation of Dahl’s 1964 tale, captures this theme by oscillating between sincerity and deadpan sarcasm with unnerving grace. Wilder, who died Monday morning at age 83, was so well suited for the role that his Wonka seems to have sprung to the silver screen directly from Dahl’s mind. (It’s somewhat disorienting, then, to return to Dahl’s physical description of Wonka as a little man with a black goatee and quick squirrel-like movements—none of which is evident in Wilder’s portrayal—though Wilder exactly fits Dahl’s version of a Wonka with blue eyes “marvelously bright... sparkling and twinkling at the same time.”)
Paul LePage suggested he might resign amidst an uproar that began when he blamed blacks and Hispanics for his state’s heroin epidemic and endorsed racial profiling.
For years, it seemed that no remark was too outrageous for Paul LePage: there was practically nothing he would not say, and there was no indication that his ever more erratic comments carried a political cost. But now the Maine governor may have pushed his luck too far.
During a radio interview Tuesday morning, LePage implied that he might resign. “I’m looking at all options,” he said. “I think some things I’ve been asked to do are beyond my ability. I’m not going to say that I’m not going to finish it. I’m not saying that I am going to finish it.”
It’s a remarkable moment for the Republican, who has made his reputation by offering up outlandish and often plainly offensive comments. The story began in January, when LePage complained that “guys by the name D-Money, Smoothie, Shifty … come from Connecticut and New York. They come up here, they sell their heroin, then they go back home. Incidentally, half the time they impregnate a young, white girl before they leave.”
The meaning of HBO’s hypnotic miniseries lay in its characters’ eyes.
One of the most memorable images of The Night Of, the now-concluded HBO miniseries that seemed only to ever deal in memorable images, was among its simplest. In Sunday’s finale, the lawyer John Stone (John Turturro) presented his client Nasir Khan (Riz Ahmed) with the difficult decision of whether to report his other lawyer, Chandra (Amara Karan), for kissing Naz. Doing so could result in a mistrial—which could be a good thing for Naz, but would ruin Chandra’s career.
Naz said almost nothing as he listened to Stone. But his eyes were steadily focused, glassy, reflecting the white light of a window across the room. “What do you care, you like her like you like Andrea?” Stone asked, referring to Chandra and the woman Naz is accused of killing. He told Naz to think about looking in the mirror, 20 years from now, regretting his choice today.
As pay TV slowly declines, cable news faces a demographic cliff. And nobody has further to fall than the merchant of right-wing outrage.
October 7, 2016, will be the 20th birthday of the Fox News Channel, and at the moment, the network is experiencing the soap-operatic highs and lows typical of any teenager on television. In many ways, the summer of 2016 may go down in Fox News history as the company’s nadir. Its founder and leader Roger Ailes has been dishonorably dispatched, the remaining executives are dealing with a flurry of sexual harassment lawsuits, and one of its most public faces, Sean Hannity, has ignominiously remodeled himself as a gutless Trump whisperer.
And yet Fox News’ fortunes are ascendant, at least in the most quantifiable sense. The network’s annual profit in 2015 soared by about 20 percent. For the first time ever, Fox News has been the most-watched cable network among both primetime and daytime viewers for several months, with a larger audience than its nominal rivals, CNN and MSNBC, combined. Led by “The O'Reilly Factor,” Fox News doesn’t just have the best-rated news show on cable television; according to The Wrap, it has the 13 best-rated news shows on cable television.