I found it very odd to see Paul Krugman complaining that "patients are not consumers," as if "consumer" were some sort of horrible, low-status role that should never taint the sacred realm of health care. In my economics classes, "consumer" was not a value judgment; it was a descriptor. A consumer is someone who consumes, just as a producer is someone who produces and a distributor is someone who distributes. So I was a bit befuddled to see an economist arguing that "The idea that all this can be reduced to money -- that doctors are just 'providers' selling services to health care 'consumers' -- is, well, sickening. And the prevalence of this kind of language is a sign that something has gone very wrong not just with this discussion, but with our society's values." Patients consume health care resources. Providers provide them. And the system through which labor and resources are allocated in our society remains money--an arrangement that I'm pretty sure Paul Krugman doesn't want to change.
This semantic moralizing distracts from what I do think is the core argument between the partisans of the "People's Budget" and the advocates of Ryan's Medicare voucher plan: whether consumers (patients) or a central committee (IPAB) should be in charge of deciding what to do with limited health care resources. Paul Krugman, unsurprisingly, is against putting consumers in control:
"Consumer-based" medicine has been a bust everywhere it has been tried. Medicare Advantage was supposed to save money; it ended up costing substantially more than traditional Medicare. America has the most "consumer-driven" health care system in the advanced world. It also has by far the highest costs yet provides a quality of care no better than far cheaper systems in other countries.
But the fact that Republicans are demanding that we stake our health on a failed approach is only part of what's wrong. As I said earlier, there's something wrong with the whole notion of patients as "consumers" and health care as simply a financial transaction.
Medical care, after all, is an area in which crucial decisions must be made. Yet making such decisions intelligently requires a vast amount of specialized knowledge.
Furthermore, those decisions often must be made under conditions in which the patient is incapacitated, under severe stress or needs action immediately, with no time for discussion, let alone comparison shopping.
The statistics with which he opens are dubious: Medicare Advantage is more expensive because it provides more benefits, and the US isn't even close to being the leader in consumer-driven medicine, if by that you mean cost-sharing and purchasing decisions; in the rich world, that would almost certainly be Switzerland, where consumers (patients) not only pay heavily out of pocket, but purchase their own insurance, as both Kaiser and Cato will tell you.
But though Krugman may be wrong about how consumer-driven our system is, he's not wrong that this is a core conflict. Nor do I think he's wrong that patients will frequently decide wrong. Where Krugman and I differ is that I don't think that centralized rule making is going to do such a super job either, for two reasons.
The first is that providers and patients are going to fight cuts with every fiber of their being, and they will find it easier to fight on individual procedures than on increasing the size of the health care voucher; the former is not very expensive for any given procedure, while the latter is a large, obvious whack in the pocketbook for taxpayers. Think of how easy it has been for oxygen providers to keep their Medicare reimbursements--and how hard it was to pass a new health care entitlement.
But the second is that while consumers may be stupid, rules are often stupid too. Evidence-based medicine is certainly a good idea, but we are nowhere near being able to generate solid rules that a) cover all major possibilities and b) provide the highest chance of survival for the money. People are incredibly complicated. This makes outcomes hard to measure--and solid guidelines hard to develop. Drugs are the most intensively tested health care treatments we have, with the sort of rigorously controlled, double-blind studies that you need to get significant results. But we don't do nearly as much testing as we should: too little head-to-head testing of various products, and far too little testing that could distinguish sub-populations which benefit most from a given drug. It's common to blame pharmaceutical companies' financial incentives, and that's part of it, which is why I support having the government do more head-to-head testing. But that's far from the only limitation. The biggest limitation is often finding enough patients with a given disease to produce statistically significant results. The more satisfied patients are with their current treatments, the harder it is to test whether those treatments are effective.
But even if we had the kind of data we'd need to develop a comprehensive set of rules, the problem remains: rules are stupid. You need to leave room for individual discretion. And individual discretion on the part of doctors and hospitals is a loophole you could drive a truck through.
Nor do I think the possibility of reducing costs through individual discretion is quite as impossible as Krugman makes it sound. Sure, a lot of decisions are life-or-death, last-minute things. But a lot of them aren't. They're questions like, "Do we send grandma to a nursing home, or try to keep her in the spare bedroom with the help of a home health-care aide?" Or "I've got stage four breast cancer with bone metastases; should I really mortgage the house to try another round of chemo?"
It's all very well to say that people shouldn't have to make those decisions on the basis of money. But making decisions on the basis of money is exactly what the government is going to do. Sure, there are some procedures that people just shouldn't have (like a lot of back surgery). But a lot of this is value judgments: hip replacements for elderly patients, expensive chemotherapy that may extend life by a few months, more convenient dosing schedules or better side-effect profiles for brand name drugs. Unless we simply rely on across-the-board reimbursement cuts--which would be moronic on every level--the government is mostly not going to be deciding which treatments are effective; it's going to be deciding which treatments are cost-effective. We haven't taken doctors out of the business of selling health care to patients; we've just added a middleman.
Now, maybe you think that the government is smarter than the consumers it's speaking for. But how does the government know what you value most: an extra three months of life when you have cancer, or an extra five years of walking after age 89, or an extra $4,000 right now?
I think that people who favor a central board probably put more faith in technocrats than I do, but also, that they are horrified by the specificity of the choices. They're comfortable making decisions about who lives or who dies when the people in those decisions are just decimal points in an aggregate statistic. But they find it horrifying that anyone--particularly the patient--should have to make that decision about a specific person.
But to me, they're not really that different. All those decimal points are people too. And it's just as heart-rending when they suffer or die.
The staggering scope of the country’s infrastructure initiative—and what it means for the international order
The Pakistani town of Gwadar was until recently filled with the dust-colored cinderblock houses of about 50,000 fishermen. Ringed by cliffs, desert, and the Arabian Sea, it was at the forgotten edge of the earth. Now it’s one centerpiece of China’s “Belt and Road” initiative, and the town has transformed as a result. Gwadar is experiencing a storm of construction: a brand-new container port, new hotels, and 1,800 miles of superhighway and high-speed railway to connect it to China’s landlocked western provinces. China and Pakistan aspire to turn Gwadar into a new Dubai, making it a city that will ultimately house 2 million people.
China is quickly growing into the world’s most extensive commercial empire. By way of comparison, after World War II, the Marshall Plan provided the equivalent of $800 billion in reconstruction funds to Europe (if calculated as a percentage of today’s GDP). In the decades after the war the United States was also the world’s largest trading nation, and its largest bilateral lender to others.
Ivana Trump’s new book is a parenting memoir—and an ode to being better than everyone else.
There’s a story Ivana Trump tells in Raising Trump, her new memoir of parenting, work, and marriage. It was New Year’s Eve, 1977; she and Donald Trump were together in the hospital room after their first child had been born, discussing the matter of what name to give their new infant. Ivana suggested that the son should be named after the father: Donald Trump Jr. Donald, however, balked at this.
“What if he’s a loser?” he said.
Ivana got her way, in this instance as in many she describes in Raising Trump, which begins and ends with the premise that none of the three children Ivana and Donald Trump created together have been consigned to a life of loserdom. The book may be a parenting memoir; it may feature practical tips about punishments and allowances and the compulsory writing of thank-you notes; it may even feature a curated selection of awkward family photos and treasured family recipes; but it is about parenting, as most people practice it, in only the most superficial sense. By virtue of its core characters—a man who becomes the American president, a daughter who becomes his advisor, a son-in-law who becomes responsible for criminal justice reform and opioid crisis management and bringing peace to the Middle East—Raising Trump is less a straightforward memoir than it is a teasing exploration of the workings of the presidential family. Here are the oft-discussed "Trump family values," as explained by the woman who helped to create them.
We spend two full years of our lives washing ourselves. How much of that time (and money and water) is a waste?
12,167 hours of washing our bodies.
That’s how much of your life you use if you spend 20 minutes per day washing and moisturizing your skin and hair (and you live to be 100, as we all surely will).
Counting only waking hours, that adds up to nearly two entire years spent washing.
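The figures above can be checked with some quick arithmetic. Note that the 16-hour waking day is my assumption, not the article's:

```python
# Back-of-the-envelope check of the washing-time figures.
minutes_per_day = 20           # daily washing/moisturizing time
years_of_life = 100            # the article's (optimistic) lifespan
total_hours = minutes_per_day * 365 * years_of_life / 60

calendar_days = total_hours / 24           # spread over full 24-hour days
waking_hours_per_day = 16                  # assumed; not stated in the article
waking_years = total_hours / waking_hours_per_day / 365

print(round(total_hours))      # 12167 hours, matching the article
print(round(calendar_days))    # about 507 calendar days
print(round(waking_years, 1))  # roughly 2.1 "waking years"
```

So the "nearly two entire years" claim holds only if you count waking hours; in plain calendar time it is closer to a year and four months.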
Not to mention water usage and the cost of cosmetic products—which we need, because commercials tell us to remove the oil from our skin with soap, and then to moisturize with lotion. Other commercials tell us to remove the oils from our hair, and then moisturize with conditioner.
That’s four products—plus a lot of water and time—and few people question whether it’s anything short of necessary.
It’s not just the fault of advertising; most of us also know from personal experience that if we go even a day or two without showering, we become oily, smelly beasts.
In the media world, as in so many other realms, there is a sharp discontinuity in the timeline: before the 2016 election, and after.
Things we thought we understood—narratives, data, software, news events—have had to be reinterpreted in light of Donald Trump’s surprising win as well as the continuing questions about the role that misinformation and disinformation played in his election.
Tech journalists covering Facebook had a duty to cover what was happening before, during, and after the election. Reporters tried to see past their often liberal political orientations and the unprecedented actions of Donald Trump to see how 2016 was playing out on the internet. Every component of the chaotic digital campaign has been reported on, here at The Atlantic, and elsewhere: Facebook’s enormous distribution power for political information, rapacious partisanship reinforced by distinct media information spheres, the increasing scourge of “viral” hoaxes and other kinds of misinformation that could propagate through those networks, and the Russian information ops agency.
More comfortable online than out partying, post-Millennials are safer, physically, than adolescents have ever been. But they’re on the brink of a mental-health crisis.
One day last summer, around noon, I called Athena, a 13-year-old who lives in Houston, Texas. She answered her phone—she’s had an iPhone since she was 11—sounding as if she’d just woken up. We chatted about her favorite songs and TV shows, and I asked her what she likes to do with her friends. “We go to the mall,” she said. “Do your parents drop you off?,” I asked, recalling my own middle-school days, in the 1980s, when I’d enjoy a few parent-free hours shopping with my friends. “No—I go with my family,” she replied. “We’ll go with my mom and brothers and walk a little behind them. I just have to tell my mom where we’re going. I have to check in every hour or every 30 minutes.”
Those mall trips are infrequent—about once a month. More often, Athena and her friends spend time together on their phones, unchaperoned. Unlike the teens of my generation, who might have spent an evening tying up the family landline with gossip, they talk on Snapchat, the smartphone app that allows users to send pictures and videos that quickly disappear. They make sure to keep up their Snapstreaks, which show how many days in a row they have Snapchatted with each other. Sometimes they save screenshots of particularly ridiculous pictures of friends. “It’s good blackmail,” Athena said. (Because she’s a minor, I’m not using her real name.) She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”
The director is blaming the critical aggregator for dooming more complex films, but the deeper problem is studio neglect.
Last weekend, Professor Marston and the Wonder Women, a drama about the creator of the famed comic-book character, became the latest mid-budget casualty. It was marketed on the back of its connection with Wonder Woman, one of the biggest hits of the year. It received a moderately wide release and got strong reviews, but its three-day box-office total was just $736,883—a flimsy average of $600 per theater, which essentially doomed any future chance of success. Critics and industry insiders alike have lamented for years the decline of modestly budgeted movies aimed at grownups, the sort of film that was once the backbone of Hollywood.
Professor Marston would likely have at least one sympathizer in Martin Scorsese, who recently wrote an op-ed for The Hollywood Reporter on how many good, artistic movies are struggling to find receptive audiences in this new era for the industry. “Box office is the undercurrent in almost all discussions of cinema, and frequently it’s more than just an undercurrent,” said the Academy Award-winning director, who also works tirelessly in the field of film preservation. Indeed, in most cases, a movie is judged a flop or a hit within the first few days of its release. Box-office prognosticators can predict a film’s final grosses almost immediately, and there’s very little chance for word-of-mouth to help build up hype, except in the cases of certain smaller independent works.
About 10 years ago, after I’d graduated college but when I was still waitressing full-time, I attended an empowerment seminar. It was the kind of nebulous weekend-long event sold as helping people discover their dreams and unburden themselves from past trauma through honesty exercises and the encouragement to “be present.” But there was one moment I’ve never forgotten. The group leader, a man in his 40s, asked anyone in the room of 200 or so people who’d been sexually or physically abused to raise their hands. Six or seven hands tentatively went up. The leader instructed us to close our eyes, and asked the question again. Then he told us to open our eyes. Almost every hand in the room was raised.
A small group of programmers wants to change how we code—before catastrophe strikes.
There were six hours during the night of April 10, 2014, when the entire population of Washington State had no 911 service. People who called for help got a busy signal. One Seattle woman dialed 911 at least 37 times while a stranger was trying to break into her house. When he finally crawled into her living room through a window, she picked up a kitchen knife. The man fled.
The 911 outage, at the time the largest ever reported, was traced to software running on a server in Englewood, Colorado. Operated by a systems provider named Intrado, the server kept a running counter of how many calls it had routed to 911 dispatchers around the country. Intrado programmers had set a threshold for how high the counter could go. They picked a number in the millions.
The last seventy-five years of American foreign policy are not the story of a country consistently pursuing democratic ideals, only to see them undermined now by a fearful “blood and soil” isolationism.
Being a liberal in the Donald Trump era is tricky. On the one hand, you’re grateful for any conservative who denounces the president’s authoritarian lies. On the other, you can’t help but notice that many of the conservatives who condemn Trump most passionately—Bill Kristol, Bret Stephens, Michael Gerson, Jennifer Rubin—remain wedded to the foreign policy legacy of George W. Bush. And in criticizing Trump’s amoral “isolationism,” they backhandedly defend the disastrous interventionism that helped produce his presidency in the first place.
The godfather of this brand of hawkish, anti-Trump conservatism is John McCain. Sure, McCain—being a Republican Senator—doesn’t condemn Trump as forthrightly as his “neoconservative” allies in the press. But the terms of his critique are similar.
The foundation of Donald Trump’s presidency is the negation of Barack Obama’s legacy.
It is insufficient to state the obvious of Donald Trump: that he is a white man who would not be president were it not for this fact. With one immediate exception, Trump’s predecessors made their way to high office through the passive power of whiteness—that bloody heirloom which cannot ensure mastery of all events but can conjure a tailwind for most of them. Land theft and human plunder cleared the grounds for Trump’s forefathers and barred others from it. Once upon the field, these men became soldiers, statesmen, and scholars; held court in Paris; presided at Princeton; advanced into the Wilderness and then into the White House. Their individual triumphs made this exclusive party seem above America’s founding sins, and it was forgotten that the former was in fact bound to the latter, that all their victories had transpired on cleared grounds. No such elegant detachment can be attributed to Donald Trump—a president who, more than any other, has made the awful inheritance explicit.