There's a lot to like about this Jonathan Chait retrospective on Michael Bloomberg. Chait's main target is the insane idea that Bloomberg could ever have run for president and won. This notion rests on the idea that Bloomberg is a "centrist" when, in fact, his politics are basically the politics of the Democratic Party. If you can articulate the difference between Michael Bloomberg's politics and, say, Chuck Schumer's or Cory Booker's, I'd love to hear it. The idea of Bloomberg as a "centrist" savior rests on the premise that somewhere in the Senate there is a liberal version of Ted Cruz.
But there's something else here that's more telling. Chait quotes David Broder asserting that Bloomberg should run because:
... there is a palpable hunger among the public for someone who will attack the problems facing the country -- the war in Iraq, immigration, energy, health care -- and not worry about the politics.
This is an amazing statement, but it's of a piece with Bloomberg's contention that "people aren't good at describing what is in their own interest." There's obviously something to be said for not consulting a poll for every single decision an executive makes. But when I vote for a president, mayor, or governor, my hope is not simply that I am electing someone who will agree with me 95 percent of the time, but that I am electing someone who reflects my baseline values.
And there are obviously some choices that simply cannot be submitted to popular opinion. Even that sort of prohibition is complicated. We might assume that in 1860, a majority of the public would have supported slavery. But how do we reconcile that with the fact that South Carolina, which initiated the Civil War, was the least democratic state in the old union? As early as 1917, a majority of the House and Senate was prepared to pass an anti-lynching bill. Democracy didn't kill the anti-lynching bill; the filibuster did.
When I started writing this post I was going to point out that George W. Bush had plenty of public support for the Iraq invasion. The reality is more complicated, and had the truth been known about WMD, public support would likely have plummeted. The idea that "politics" and "public opinion" are nuisances to be trampled upon by the philosopher-kings proceeds from the basic belief that the people are stupid (or easily duped by "powerful interests") and that the obviously correct solution should immediately prevail. You see this kind of anti-democratic instinct in school reform -- Michelle Rhee's contention that she wasn't in the business of "politics," or Bloomberg's appointment of Cathie Black as schools chancellor.
There's something else here also -- there's no real track record. Anti-democrats -- despite their insistence on empiricism -- are often just as addled as the public. For every smoking ban, there's a Cathie Black. Black's appointment was not the result of an infallible algorithm designed to compute the best interest of New York students. It was the result, by Bloomberg's own account, of a desire to find someone who "came from out of left field." The appointment was a disaster. But, according to Bloomberg, it's not because he foolishly appointed someone who had no history in education; it's because she was "dumped on in the newspaper from day one." (Powerful interests!) There's always an available excuse for the technocrat.
Likewise, there is no empirical proof that stop and frisk is responsible for New York's drop in crime. But this does not stop Bloomberg from claiming it anyway, then fuming because "nobody" is talking about crime in minority neighborhoods. In fact, minorities have been talking about it since the days of "Self-Destruction" (the song is literally called "Self-Destruction"). Disagree? By Bloomberg's lights you are a "racist" who's attempting to divide the city.
Last week in class we read Elizabeth Alexander's wonderful poem "The Venus Hottentot." Reading that piece got me thinking about how tempting it is to adopt the mask of science and empiricism to conceal less noble motivations, such as ego. When Bloomberg calls Bill de Blasio's campaign "racist" or claims that he should be frisking more black people, I'm not convinced he's making a real claim. The content of the words is beside the point. Even as Bloomberg has full-throatedly defended stop and frisk, he's scaled it back. But he can't bear to say so publicly and thus concede a point to those whom he feels are besieging him. Michael Bloomberg's feelings are hurt and he wants to hurt back.
This is not about numbers. There are no numbers that support branding random mosques as "terror enterprises." But for Bloomberg technocracy means the right to tell us that the numbers mean what he says they mean.
For centuries, philosophers and theologians have almost unanimously held that civilization as we know it depends on a widespread belief in free will—and that losing this belief could be calamitous. Our codes of ethics, for example, assume that we can freely choose between right and wrong. In the Christian tradition, this is known as “moral liberty”—the capacity to discern and pursue the good, instead of merely being compelled by appetites and desires. The great Enlightenment philosopher Immanuel Kant reaffirmed this link between freedom and goodness. If we are not free to choose, he argued, then it would make no sense to say we ought to choose the path of righteousness.
Today, the assumption of free will runs through every aspect of American politics, from welfare provision to criminal law. It permeates the popular culture and underpins the American dream—the belief that anyone can make something of themselves no matter what their start in life. As Barack Obama wrote in The Audacity of Hope, American “values are rooted in a basic optimism about life and a faith in free will.”
George Will is denouncing a GOP that has been ailing for years, but quitting won’t help—an American political party can only be reformed from within.
This past weekend, George Will revealed that he had formally disaffiliated himself from the Republican Party, switching his Maryland voter registration to independent. On Fox News Sunday, the conservative pundit explained his decision: “After Trump went after the ‘Mexican’ judge from northern Indiana then [House Speaker] Paul Ryan endorsed him, I decided that in fact this was not my party anymore.” For 40 years, George Will defined and personified what it meant to be a thoughtful conservative. His intellect and authority inspired a generation of readers and viewers, myself very much among them.
His departure represents a powerful image of divorce between intellectual conservatism and the new Trump-led GOP. Above all, it raises a haunting question for the many other Republicans and conservatives repelled by the looming nomination of Donald Trump as the Republican candidate for president of the United States: What will you do?
Hillary Clinton wrote something for The Toast today. Are you sobbing yet?
Either you’ll immediately get why this is crazy, or you won’t: Hillary Clinton wrote a thing for The Toast today.
Are you weeping? Did your heart skip a beat? Maybe your reaction was, “What. Whaaaat. WHAT,” or “Aaaaaaahhhhhhh!!!” or “OH MY GOD,” or simply “this is too much goodbye I'm dead now.”
Perhaps your feelings can only be captured in GIF form, as was the case for someone commenting on Clinton’s post under the name Old_Girl:
Reader comments like the ones above are arguably the best part of Clinton’s post, because they highlight just how meaningful hearing directly from Clinton is to The Toast’s community of readers. The Toast is a small but beloved feminist website known for its quirky literary humor. It announced last month it couldn’t afford to continue operating. Friday is its last day of publication.
“This western-front business couldn’t be done again.”
On this first day of July, exactly 100 years ago, the peoples of the British Empire suffered the greatest military disaster in their history. A century later, “the Somme” remains the most harrowing place-name in the annals not only of Great Britain, but of the many former dependencies that shed their blood on that scenic river. The single regiment contributed to the First World War by the island of Newfoundland, not yet joined to Canada, suffered nearly 100 percent casualties that day: Of 801 engaged, only 68 came out alive and unwounded. Altogether, the British forces suffered more than 19,000 killed and more than 38,000 wounded: almost as many casualties in one day as Britain suffered in the entire disastrous battle for France in May and June 1940, including prisoners. The French army on the British right flank absorbed some 1,600 casualties more.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
What percentage graduated from high school and enrolled within a year at a four-year institution where they live on campus?
Who are today’s college students?
The answer surprises most people who attended four-year universities, according to Jamie Merisotis, president and CEO of the Lumina Foundation. Addressing audiences, like the one he spoke to Friday at The Aspen Ideas Festival, co-hosted by the Aspen Institute and The Atlantic, he frequently poses this question: “What percentage of students in American higher education today graduated from high school and enrolled in college within a year to attend a four-year institution and live on campus?”
Most people guess “between forty and sixty percent,” he said, whereas “the correct answer is five percent.” There is, he argued, “a real disconnect in our understanding of who today’s students are. The influencers––the policy makers, the business leaders, the media––have a very skewed view of who today’s students are.”
There needs to be more nuanced language to describe the expanding demographic of unmarried Americans.
In 1957, a team of psychology professors at the University of Michigan released the results of a survey they had conducted—an attempt to reflect Americans’ attitudes about unmarried people. When it came to the group of adults who remained single by choice, 80 percent of the survey’s respondents—reflecting the language used by the survey’s authors—said they believed that the singletons remained so because they must be “immoral,” “sick,” or “neurotic.”
It’s amazing, and reassuring, how much has changed in such a relatively narrow slice of time. Today, certainly, marriage remains a default economic and social arrangement, particularly after having been won as a right for same-sex couples; today, certainly, those who do not marry still face some latent social stigmas (or, at the very least, requests to explain themselves). But the regressive language of failed morality and psychological pathology when it comes to singledom? That has, fortunately, been replaced by more permissive attitudes.
The trend helps explain Trump and Brexit. What’s next?
On Wednesday, Facebook made an announcement that you’d think would only matter to Facebook users and publishers: It will modify its News Feed algorithm to favor content posted by a user’s friends and family over content posted by media outlets. The company said the move was not about privileging certain sources over others, but about better “connecting people and ideas.”
But Richard Edelman, the head of the communications marketing firm Edelman, sees something more significant in the change: proof of a new “world of self-reference” that, once you notice it, helps explain everything from Donald Trump’s appeal to Britain’s vote to exit the European Union. Elites used to possess outsized influence and authority, Edelman notes, but now they only have a monopoly on authority. Influence largely rests with the broader population. People trust their peers much more than they trust their political leaders or news organizations.
Sharing platforms are meant to scale seamlessly throughout the world, but they’ve faced a different knotty set of rules in nearly every city they’ve colonized.
For years now, Airbnb, the popular home-sharing platform, has featured this line of copy at the end of a company mission statement that mostly pledges to promote a sense of adventure and discovery: “And with world-class customer service and a growing community of users, Airbnb is the easiest way for people to monetize their extra space and showcase it to an audience of millions.”
It’s a business model condensed into a coda, casually set off with an “And.” The subtext is that the revenue-making potential of the platform is an afterthought, which implies that its appeal lies in its ease of use. Sign up and rent out your apartment or guest room. It’s easy.
Easy, that is, unless you live in Chicago, where regulations passed last week will require hosts to register with the city, impose a tax on each transaction to pay for the city’s homeless services, and limit the number of apartments that can be rented out in a particular building, depending on its size. Or in San Francisco, Airbnb’s hometown, where a law that went into effect in 2015 limits the total number of days an apartment can be rented out per year and similarly requires hosts to register with the city. (This week, the company, which coincidentally helped draft the 2014 law, decided to sue the city over it.) Months after San Francisco imposed those limits, Santa Monica passed regulations requiring hosts to get business licenses and restricted them from renting out entire properties.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?