Sharing is human. We are social. We communicate. We learn from each other. Our first conversations with people we don't know are anecdote competitions. If in the 15th century everyone had owned a printing press, Europe would have been littered with personal missives and opinions. Cameras were one of the first mass-market story-telling devices, and stories were told. Then: curated, bundled, and shared.
The genius of Facebook has always been its facilitation of sharing. Its pivotal innovation -- the one that inspired its first rash of furious remonstrations -- was the automatic sharing of news feeds between friends. In the Friendster/MySpace world, users could visit their friends' feeds, but they did not receive them passively. Facebook's decision to push these feeds out to users' contacts led to howls about privacy -- and that's what made the service a sensation.
Facebook's role in our world is to lead us where we're headed. We like to share who we are and what we like. We're consumers who pay more for things stamped with particular logos, after all; we shouldn't be taken aback when someone tries to spread that idea. Facebook has been there for almost a decade, guiding us toward a place where displays of what we're doing and where we are become the simple documentation of the life of an average Joe.
The company's biggest struggle has been figuring out how to make money from it. An early effort, Beacon, was a flop. People are happy to share information -- photos, stories, links, videos -- but only information they have carefully selected. Beacon took it upon itself to share information about online purchases and transactions -- and people revolted. It was Facebook's most notable failure, and it stemmed from sharing that didn't derive from the user.
Last year's launch of Open Graph began an exploration of how to work around that. It combined two innovations: the global Like button and the ability of some sites to pull information from Facebook without your agreeing to it. Beacon lite. This met with outcry -- I'm losing control over my information! -- which quickly subsided as it became apparent that the intrusion was minimal. People weren't interested in your Pandora stations, but Facebook cracked the door toward using your information the way it wanted.
Slate's Farhad Manjoo has perhaps the savviest take on the innovations Facebook announced yesterday. In addition to Timeline -- the elegant, deep presentation of a user's Facebook history -- the company revealed that it sought to make sharing information "frictionless," which is to say, automatic. Watch a movie or listen to a song and it gets shared, without the tedium of your clicking anything.
The problem with that, of course, is that it eliminates the curation aspect of our self-presentations. It would be as though I told everyone that I was wearing blue jeans and a somewhat worse-for-wear t-shirt right now in addition to revealing that earlier today I wore a sharp, tailored suit. Both are accurate, but only one is the impression I'd like to leave with people. (The latter.) Talking about the suit is Facebook. Talking about my scrubby jeans is Beacon.
I used to work at Adobe. One summer, the company brought in a number of well-known artists to work on a project, one of whom was a photographer. Using Photoshop, he cleaned up his photos of the other participants, noting that "a photo is not meant to be a dermatological record." This is extensible: the image we present to the world is not meant to include every single bit of information possible. What we share is selected to be a representation of the ideal we want to project, not a reflection of who we are. Our curation itself is representative; what we don't say says something, too. Facebook moving curation from us to its algorithms means we could lose some of our personality in what we present. It's akin to putting every photo in a photo album, and letting the album worry about what to display.
But this is incidental. Facebook anticipates -- correctly -- that we want easy processes to share more and more about ourselves. Or, at least, that we will soon. We've always wanted simple ways to scrapbook, and Facebook is poised to be one of the simplest.
Where they may have missed the mark is in taking away our ability to decide what we show.
My view on the Hillary Clinton email “scandal,” as expressed over the months and also yesterday, is that this is another Whitewater. By which I mean: that the political and press hubbub, led in each case on the press’s side by the New York Times, bears very little relationship to the asserted underlying offense, and that after a while it’s hard for anyone to explain what the original sin / crime / violation was in the first place.
The Whitewater investigation machine eventually led, through a series of Rube Goldberg / Jorge Luis Borges-style weirdnesses, to the impeachment of Bill Clinton, even though the final case for removing him from office had exactly nothing to do with the original Whitewater complaint. Thus it stands as an example of how scandals can take on a zombie existence of their own, and of the damage they can do. The Hillary Clinton email “scandal” has seemed another such case to me, as Trey Gowdy’s committee unintentionally demonstrated with its 11-hour attempted takedown of Clinton last year.
Hillary Clinton’s realistic attitude is the only thing that can effect change in today’s political climate.
Bernie Sanders and Ted Cruz have something in common. Both have an electoral strategy predicated on the ability of a purist candidate to revolutionize the electorate—bringing droves of chronic non-voters to the polls because at last they have a choice, not an echo—and along the way transforming the political system. Sanders can point to his large crowds and impressive, even astonishing, success at tapping into a small-donor base that exceeds, in breadth and depth, the remarkable one built in 2008 by Barack Obama. Cruz points to his extraordinarily sophisticated voter-identification operation, one that certainly seemed to do the trick in Iowa.
But is there any real evidence that there is a hidden “sleeper cell” of potential voters who are waiting for the signal to emerge and transform the electorate? No. Small-donor contributions are meaningful and a sign of underlying enthusiasm among a slice of the electorate, but they represent a tiny sliver even of that slice; Ron Paul’s success at fundraising (and his big crowds at rallies) misled many analysts into believing that he would make a strong showing in Republican primaries when he ran for president. He flopped.
The new Daily Show host, Trevor Noah, is smooth and charming, but he hasn’t found his edge.
It’s a psychic law of the American workplace: By the time you give your notice, you’ve already left. You’ve checked out, and for the days or weeks that remain, a kind of placeholder-you, a you-cipher, will be doing your job. It’s a law that applies equally to dog walkers, accountants, and spoof TV anchormen. Jon Stewart announced that he was quitting The Daily Show in February 2015, but he stuck around until early August, and those last months had a restless, frazzled, long-lingering feel. A smell of ashes was in the air. The host himself suddenly looked quite old: beaky, pique-y, hollow-cheeky. For 16 years he had shaken his bells, jumped and jangled in his little host’s chair, the only man on TV who could caper while sitting behind a desk. Flash back to his first episode as the Daily Show host, succeeding Craig Kilborn: January 11, 1999, Stewart with floppy, luscious black hair, twitching in a new suit (“I feel like this is my bar mitzvah … I have a rash like you wouldn’t believe.”) while he interviews Michael J. Fox.
Republicans may have a lock on Congress and the nation’s statehouses—and could well win the presidency—but the liberal era ushered in by Barack Obama is only just beginning.
Over roughly the past 18 months, the following events have transfixed the nation.
In July 2014, Eric Garner, an African American man reportedly selling loose cigarettes illegally, was choked to death by a New York City policeman.
That August, a white police officer, Darren Wilson, shot and killed an African American teenager, Michael Brown, in Ferguson, Missouri. For close to two weeks, protesters battled police clad in military gear. Missouri’s governor said the city looked like a war zone.
In December, an African American man with a criminal record avenged Garner’s and Brown’s deaths by murdering two New York City police officers. At the officers’ funerals, hundreds of police turned their backs on New York’s liberal mayor, Bill de Blasio.
If passion is a job requirement, says the writer Miya Tokumitsu, employees have little room to complain about mistreatment at work.
It’s been said in many places and by many luminaries: Do what you love.
But what does this phrase actually mean?
Miya Tokumitsu, a contributing editor at Jacobin magazine and author of the new book Do What You Love And Other Lies About Success and Happiness, criticizes the pervasiveness of this idea in American work culture. She argues that “doing what you love” has been co-opted by corporate interests, giving employers more power to exploit their workers.
I recently spoke with Tokumitsu about work myths and why we should pay attention to them. The following transcript of our conversation has been edited for clarity.
Bourree Lam: Your book started as an essay, “In the Name of Love” (which was later republished by Slate), that really touched a nerve with people. What were you talking about in that essay, and why are people so drawn to it?
Rand Paul, once viewed as the frontrunner, is leaving the Republican race after never gaining much momentum. So is Rick Santorum.
The story of Rand Paul’s presidential campaign, which he’s suspending today, is one of unfulfilled expectations.
Paul, a first-term senator from Kentucky, entered the race with high hopes. In January 2014, my colleague Peter Beinart deemed him the Republican frontrunner. A few months later, in October, Time named him “the most interesting man in politics.” But voters never seemed to agree: He limped into Iowa trailing in the polls and ended up tallying less than 5 percent there—better than Jeb Bush, but still not a figure that set him up to compete down the road.
It’s understandable why Paul’s presidential prospects once seemed so bright. The nation was in the midst of what appeared to be a “libertarian moment.” Liberals and conservatives alike were joined in their backlash against an overweening security state, revealed by Edward Snowden. Newfound skepticism about the police fit in, too, and Paul was talking about the GOP’s dire need to reach out to minorities like no other candidate. The Tea Party, which had helped him upset an establishment candidate in the Kentucky Senate primary, was still a major force. His 13-hour filibuster of CIA Director John Brennan’s nomination won widespread acclaim. While rivals like Ted Cruz and Marco Rubio either alienated colleagues or flailed, Paul was consolidating the support—unexpectedly—of Mitch McConnell, the powerful Senate majority leader and fellow Kentuckian. Paul was also expected to bring in the organizational energy and know-how that his father, former Representative Ron Paul, had built over many years.
The championship game descends on a city failing to deal with questions of affordability and inclusion.
SAN FRANCISCO—The protest kicked off just a few feet from Super Bowl City, the commercial playground behind security fences on the Embarcadero, where football fans were milling about drinking beer, noshing on $18 bacon cheeseburgers, and lining up for a ride on a zip line down Market Street.
The protesters held up big green camping tents painted with slogans such as “End the Class War” and “Stop Stealing Our Homes,” and chanted phrases blaming San Francisco Mayor Ed Lee for a whole range of problems, including the catchy “Hey Hey, Mayor Lee, No Penalty for Poverty.” They blocked the sidewalk, battling with tourists, joggers, and city workers, some of whom were trying to wheel their bikes through the crowd to get to the ferries that would take them home.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
The justices will take on a complicated set of cases related to the birth-control mandate in the Affordable Care Act.
On Friday, the Supreme Court decided to tackle the case of the Little Sisters of the Poor, a group of nuns who believe, along with some priests, a Roman Catholic Archdiocese, and several universities, that the government is compelling them to violate their beliefs. Their claim: The so-called birth-control mandate of the Affordable Care Act places a burden on their religious exercise, even with an accommodation from the government.
The first thing to know about these cases is that they are incredibly complicated. The Court granted cert in seven different cases related to this topic, agreeing to hear the questions raised in each. All of them have come through different circuit courts in recent months, where the challengers have mostly lost. But these cases are a big deal: They are the latest in a long series of challenges to this portion of the law, the most notable of which was last summer’s Hobby Lobby case, which involved for-profit employers.