There's a lot to like about this Jonathan Chait retrospective on Michael Bloomberg. Chait's main target is the insane idea that Bloomberg could ever have run for president and won. This notion rests on the idea that Bloomberg is a "centrist" when, in fact, his politics are basically the politics of the Democratic Party. If you can articulate the difference between Michael Bloomberg's politics and, say, Chuck Schumer's or Cory Booker's, I'd love to hear it. The idea of Bloomberg as a "centrist" savior rests on the premise that somewhere in the Senate there is a liberal version of Ted Cruz.
But there's something else here that's more telling. Chait quotes David Broder asserting that Bloomberg should run because:
... there is a palpable hunger among the public for someone who will attack the problems facing the country -- the war in Iraq, immigration, energy, health care -- and not worry about the politics.
This is an amazing statement, but it's of a piece with Bloomberg's contention that "people aren't good at describing what is in their own interest." There's obviously something to be said for not consulting a poll for every single decision an executive makes. I think when people vote for a president, mayor, or governor, they are not simply electing someone who will agree with them 95 percent of the time, but someone who reflects their baseline values.
And there are obviously some choices that simply cannot be submitted to popular opinion. Even that sort of prohibition is complicated. We might assume that in 1860, a majority of the public would have supported slavery. But how do we reconcile that with the fact that South Carolina, which initiated the Civil War, was the least democratic state in the old Union? As early as 1917, a majority of the House and Senate was prepared to pass an anti-lynching bill. Democracy didn't kill the anti-lynching bill; the filibuster did.
When I started writing this post I was going to point out that George W. Bush had plenty of public support for the Iraq invasion. The reality is more complicated, and had the truth been known about WMD, public support would likely have plummeted. The idea that "politics" and "public opinion" are nuisances to be trampled upon by the philosopher-kings proceeds from the basic belief that the people are stupid (or easily duped by "powerful interests") and that the obviously correct solution should immediately prevail. You see this kind of anti-democratic instinct in school reform -- Michelle Rhee's contention that she wasn't in the business of "politics," or Bloomberg's appointment of Cathie Black as schools chancellor.
There's something else here also -- there's no real track record. Anti-democrats -- despite their insistence on empiricism -- are often just as addled as the public. For every smoking ban, there's a Cathie Black. Black's appointment was not the result of an infallible algorithm designed to compute the best interest of New York students. It was the result, by Bloomberg's own account, of a desire to find someone who "came from out of left field." The appointment was a disaster. But, according to Bloomberg, it's not because he foolishly appointed someone who had no history in education; it's because she was "dumped on in the newspaper from day one." (Powerful interests!) There's always an available excuse for the technocrat.
Likewise, there is no empirical proof that stop and frisk is responsible for New York's drop in crime. But this does not stop Bloomberg from claiming it anyway, then fuming because "nobody" is talking about crime in minority neighborhoods. In fact, minorities have been talking about it since the days of "Self-Destruction" (the song is literally called "Self-Destruction"). Disagree? By Bloomberg's lights you are a "racist" who's attempting to divide the city.
Last week in class we read Elizabeth Alexander's wonderful poem "The Venus Hottentot." Reading that piece got me thinking about how tempting it is to adopt the mask of science and empiricism to conceal less noble motivations, such as ego. When Bloomberg calls Bill de Blasio's campaign "racist" or claims that he should be frisking more black people, I'm not convinced he's making a real claim. The content of the words is beside the point. Even as Bloomberg has full-throatedly defended stop and frisk, he's scaled it back. But he can't bear to say that publicly and thus concede a point to those whom he feels are besieging him. Michael Bloomberg's feelings are hurt and he wants to hurt back.
This is not about numbers. There are no numbers that support branding random mosques as "terror enterprises." But for Bloomberg technocracy means the right to tell us that the numbers mean what he says they mean.
A rock structure, built deep underground, is one of the earliest hominin constructions ever found.
In February 1990, thanks to a 15-year-old boy named Bruno Kowalsczewski, footsteps echoed through the chambers of Bruniquel Cave for the first time in tens of thousands of years.
The cave sits in France’s scenic Aveyron Valley, but its entrance had long been sealed by an ancient rockslide. Kowalsczewski’s father had detected faint wisps of air emerging from the scree, and the boy spent three years clearing away the rubble. He eventually dug out a tight, thirty-meter-long passage that the thinnest members of the local caving club could squeeze through. They found themselves in a large, roomy corridor. There were animal bones and signs of bear activity, but nothing recent. The floor was pockmarked with pools of water. The walls were punctuated by stalactites (the ones that hang down) and stalagmites (the ones that stick up).
Washington voters handed Hillary Clinton a primary win, symbolically reversing the result of the state caucus where Bernie Sanders prevailed.
Washington voters delivered a bit of bad news for Bernie Sanders’s political revolution on Tuesday. Hillary Clinton won the state’s Democratic primary, symbolically reversing the outcome of the state’s Democratic caucus in March where Sanders prevailed as the victor. The primary result won’t count for much since delegates have already been awarded based on the caucus. (Sanders won 74 delegates, while Clinton won only 27.) But Clinton’s victory nevertheless puts Sanders in an awkward position.
Sanders has styled himself as a populist candidate intent on giving a voice to voters in a political system in which, as he describes it, party elites and wealthy special-interest groups exert too much control. As the primary election nears its end, Sanders has railed against Democratic leaders for unfairly intervening in the process, a claim he made in the aftermath of the contentious Nevada Democratic convention earlier this month. He has also criticized superdelegates—elected officials and party leaders who can support whichever candidate they choose—for effectively coronating Clinton.
Narcissism, disagreeableness, grandiosity—a psychologist investigates how Trump’s extraordinary personality might shape his possible presidency.
In 2006, Donald Trump made plans to purchase the Menie Estate, near Aberdeen, Scotland, aiming to convert the dunes and grassland into a luxury golf resort. He and the estate’s owner, Tom Griffin, sat down to discuss the transaction at the Cock & Bull restaurant. Griffin recalls that Trump was a hard-nosed negotiator, reluctant to give in on even the tiniest details. But, as Michael D’Antonio writes in his recent biography of Trump, Never Enough, Griffin’s most vivid recollection of the evening pertains to the theatrics. It was as if the golden-haired guest sitting across the table were an actor playing a part on the London stage.
“It was Donald Trump playing Donald Trump,” Griffin observed. There was something unreal about it.
Americans persist in thinking that Adam Smith's rules for free trade are the only legitimate ones. But today's fastest-growing economies are using a very different set of rules. Once, we knew them—knew them so well that we played by them, and won. Now we seem to have forgotten.
In Japan in the springtime of 1992 a trip to Hitotsubashi University, famous for its economics and business faculties, brought me unexpected good luck. Like
several other Japanese universities, Hitotsubashi is almost heartbreaking in
its cuteness. The road from the station to the main campus is lined with cherry
trees, and my feet stirred up little puffs of white petals. Students glided
along on their bicycles, looking as if they were enjoying the one stress-free
moment of their lives.
They probably were. In surveys huge majorities of students say that they study
"never" or "hardly at all" during their university careers. They had enough of
that in high school.
I had gone to Hitotsubashi to interview a professor who was making waves. Since
the end of the Second World War, Japanese diplomats and businessmen have acted
as if the American economy should be the model for Japan's own industrial
growth. Not only should Japanese industries try to catch up with America's lead
in technology and production but also the nation should evolve toward a
standard of economic maturity set by the United States. Where Japan's economy
differed from the American model—for instance, in close alliances between
corporations which U.S. antitrust laws would forbid—the difference should be
considered temporary, until Japan caught up.
Speculation about how Ramsay Bolton might die reveals the challenges of devising a cathartic TV death—and illuminates a larger issue facing the series.
Warning: Season 6 spoilers abound.
Ever since Ramsay Bolton revealed himself as Westeros’s villain-in-chief, Game of Thrones fans have wanted him dead. He first appeared in season three disguised as a Northern ally sent to help Theon Greyjoy but quickly turned out to be a lunatic whose appetite for cruelty only grew as the series progressed. (Last year, Atlantic readers voted him the actual worst character on television.) After several colorful and nauseating years of rape, torture, murder, and bad visual puns, speculation about the Bolton bastard’s looming death has reached its peak this sixth season. But “Will Ramsay die this season?” also gives way to a slightly more complicated question: “How should Ramsay die?”
Bernie Sanders is contesting the Democratic primary to the end, just as Hillary Clinton did eight years ago—but that parallel has its limits.
In May of 2008, two Democrats were somehow still fighting over the nomination. The stronger of the two had a comfortable lead in delegates and made calls to unify the party. But the weaker contender, buoyed by a loyal base, refused to give up. It got awkward.
The difference in 2016, of course, is Hillary Clinton’s position in the drama. She played the spoiler eight years ago, refusing to concede to Barack Obama in a primary that dragged into June, to the consternation of party elders. (They were nervously eyeing John McCain, who had pluckily sewn up his nomination by late February.) But this year, she is the candidate ascendant, impatient to wrap up this whole Bernie Sanders business and take on Donald Trump.
In an ironic twist, the Republican nominee—the author of many a failed real-estate deal—is trying to use the Clintons’ bad 1978 land purchase against Hillary Clinton.
Suddenly it looks like the presidential campaign could turn into a referendum on the 1990s. No, that doesn’t mean you get to vote your opinion on Third Eye Blind. Instead, Donald Trump seems to be determined to dredge up the detritus of the decade to attack Hillary Clinton.
Democrats knew what they were getting with the Clintons—an incredible political powerhouse, and a perpetual whiff of scandal. What they didn’t know, and still don’t, is how bad it will be this time, and how much it will matter.
Now comes one of the first tests. On Monday, Trump released a short video highlighting accusations of rape lodged against Bill Clinton by Kathleen Willey and Juanita Broaddrick. Attacks on Bill Clinton’s scandals are certainly fair game—the former president will find plenty of defenders, but his behavior will not. Whether they will work is a different matter. Hillary Clinton is trying to strike a delicate balance, reminding people why they liked the Clinton years without running as a nostalgia candidate, but she is ultimately the candidate—not her husband. The attacks could also simply remind people of Trump’s own checkered past as both a friend of the Clintons and a subject of sexual-harassment allegations. (I write in more depth about the risks, rewards, and lessons of this strategy here.)
For centuries, philosophers and theologians have almost unanimously held that civilization as we know it depends on a widespread belief in free will—and that losing this belief could be calamitous. Our codes of ethics, for example, assume that we can freely choose between right and wrong. In the Christian tradition, this is known as “moral liberty”—the capacity to discern and pursue the good, instead of merely being compelled by appetites and desires. The great Enlightenment philosopher Immanuel Kant reaffirmed this link between freedom and goodness. If we are not free to choose, he argued, then it would make no sense to say we ought to choose the path of righteousness.
Today, the assumption of free will runs through every aspect of American politics, from welfare provision to criminal law. It permeates the popular culture and underpins the American dream—the belief that anyone can make something of themselves no matter what their start in life. As Barack Obama wrote in The Audacity of Hope, American “values are rooted in a basic optimism about life and a faith in free will.”
The day—a celebration of corporate conformity disguised as a celebration of individuality—helped to bring about the current dominance of “business casual.”
The New York Times ran a story Wednesday announcing “The End of the Office Dress Code.” The suit and its varied strains, the article argues—corporate uniforms that celebrate, well, corporate uniformity—are giving way to more individualized interpretations of “office attire.” As the writer Vanessa Friedman puts it, “We live in a moment in which the notion of a uniform is increasingly out of fashion, at least when it comes to the implicit codes of professional and public life.”
It’s true. We live in a time in which our moguls dress in hoodies and T-shirts, and in which more and more workers are telecommuting—working not just from home, but from PJs. It’s a time, too, when the lines between “work” and “everything else” are increasingly—and sometimes frustratingly—fluid. And so: It’s also a time when many of us are trying to figure out, together, what “work clothes” actually means, and the extent to which the term might vary across professions. As Emma McClendon, who curated a new exhibit on uniforms for the Museum at the Fashion Institute of Technology, summed it up: “We are in a very murky period.”
What’s harder to believe: that it took a year for Andrea Constand to accuse the star of sexual assault, or that it’s taken 11 years and dozens more women coming forward for those accusations to be heard in court?
To date, more than 50 women have accused Bill Cosby of sexual misconduct. Constand was the first. In January of 2005 she told police that a year earlier, Cosby had touched and penetrated her after drugging her. A prosecutor decided against proceeding with the case, and Constand followed up with a civil suit that resulted in a 2006 settlement. After that came an accelerating drip of women making allegations about incidents spanning a wide swath of Cosby’s career, from Kristina Ruehli (1965) to Chloe Goins (2008).