Sharing is human. We are social. We communicate. We learn from each other. Our first conversations with people we don't know are anecdote competitions. If in the 15th century everyone had owned a printing press, Europe would have been littered with personal missives and opinions. Cameras were one of the first mass-market story-telling devices, and stories were told. Then: curated, bundled, and shared.
The genius of Facebook has always been its facilitation of sharing. Its pivotal innovation -- the one that inspired its first rash of furious remonstrations -- was the News Feed, which automatically shared friends' updates. In the Friendster/MySpace world, users could visit their friends' pages, but they did not receive updates passively. Facebook's decision to push those updates out to users' contacts led to howls about privacy -- and that's what made the service a sensation.
Facebook's role in our world is to lead us where we're headed. We like to share who we are and what we like. We're consumers who pay more for things stamped with particular logos, after all; we shouldn't be taken aback when someone tries to spread that idea. Facebook has been there for almost a decade, guiding us toward a place where displaying what we're doing and where we are becomes the simple documentation of an average Joe's life.
The company's biggest struggle has been figuring out how to make money from all that sharing. An early effort, Beacon, was a flop. People are happy to share information -- photos, stories, links, videos -- but only information they have carefully selected. Beacon took it upon itself to share information about online purchases and transactions -- and people revolted. It was Facebook's most notable failure, and it stemmed from sharing that the user hadn't initiated.
Last year's launch of Open Graph began an exploration of how to work around that. It combined two innovations: the global Like button and the ability of some sites to pull information from Facebook without your agreeing to it. Beacon lite. This met with outcry -- I'm losing control over my information! -- which quickly subsided as it became apparent that the intrusion was minimal. People weren't interested in your Pandora stations, but Facebook had cracked the door open to using your information the way it wanted.
Slate's Farhad Manjoo has perhaps the savviest take on the innovations Facebook announced yesterday. In addition to Timeline -- the elegant, deep presentation of a user's Facebook history -- the company revealed that it sought to make sharing information "frictionless," which is to say, automatic. Watch a movie or listen to a song and it gets shared, without the tedium of your clicking anything.
The problem with that, of course, is that it eliminates the curation aspect of our self-presentations. It would be as though I told everyone that I was wearing blue jeans and a somewhat worse-for-wear t-shirt right now in addition to revealing that earlier today I wore a sharp, tailored suit. Both are accurate, but only one is the impression I'd like to leave with people. (The latter.) Talking about the suit is Facebook. Talking about my scrubby jeans is Beacon.
I used to work at Adobe. One summer, the company brought in a number of well-known artists to work on a project, one of whom was a photographer. Using Photoshop, he cleaned up his photos of the other participants, noting that "a photo is not meant to be a dermatological record." This is extensible: the image we present to the world is not meant to include every single bit of information possible. What we share is selected to be a representation of the ideal we want to project, not a reflection of who we are. Our curation itself is representative; what we don't say says something, too. Facebook moving curation from us to its algorithms means we could lose some of our personality in what we present. It's akin to putting every photo in a photo album, and letting the album worry about what to display.
But this is incidental. Facebook anticipates -- correctly -- that we want easy processes to share more and more about ourselves. Or, at least, that we will soon. We've always wanted simple ways to scrapbook, and Facebook is poised to be one of the simplest.
Where it may have missed the mark is in taking away our ability to decide what we show.
As I mentioned in this post in late November, and in this followup, and also in a discussion with Diane Rehm on her new podcast series yesterday, Donald Trump’s lies differ from those we have encountered from other national figures, even Richard Nixon and Bill Clinton during their respective impeachments. The difference is that Trump seemingly does not care that evidence is immediately at hand to disprove what he says. If he believes what he’s saying, at least in that moment, why shouldn’t we?
For the record, the latest entry of this sort is the repeated insistence by Trump and his associates that he won a “landslide” or “major” victory. For instance, this was his transition team’s response to reports of Russian attempts to swing the election in his favor:
Should you drink more coffee? Should you take melatonin? Can you train yourself to need less sleep? A physician’s guide to sleep in a stressful age.
During residency, I worked hospital shifts that could last 36 hours, without sleep, often without breaks of more than a few minutes. Even writing this now, it sounds to me like I’m bragging or laying claim to some fortitude of character. I can’t think of another type of self-injury that might be similarly lauded, except maybe binge drinking. Technically the shifts were 30 hours, the mandatory limit imposed by the Accreditation Council for Graduate Medical Education, but we stayed longer because people kept getting sick. Being a doctor is supposed to be about putting other people’s needs before your own. Our job was to power through.
The shifts usually felt shorter than they were, because they were so hectic. There was always a new patient in the emergency room who needed to be admitted, or a staff member on the eighth floor (which was full of late-stage terminally ill people) who needed me to fill out a death certificate. Sleep deprivation manifested as bouts of anger and despair mixed in with some euphoria, along with other sensations I’ve not had before or since. I remember once sitting with the family of a patient in critical condition, discussing an advance directive—the terms defining what the patient would want done were his heart to stop, which seemed likely to happen at any minute. Would he want to have chest compressions, electrical shocks, a breathing tube? In the middle of this, I had to look straight down at the chart in my lap, because I was laughing. This was the least funny scenario possible. I was experiencing a physical reaction unrelated to anything I knew to be happening in my mind. There is a type of seizure, called a gelastic seizure, during which the seizing person appears to be laughing—but I don’t think that was it. I think it was plain old delirium. It was mortifying, though no one seemed to notice.
How Vladimir Putin is making the world safe for autocracy
Since the end of World War II, the most crucial underpinning of freedom in the world has been the vigor of the advanced liberal democracies and the alliances that bound them together. Through the Cold War, the key multilateral anchors were NATO, the expanding European Union, and the U.S.-Japan security alliance. With the end of the Cold War and the expansion of NATO and the EU to virtually all of Central and Eastern Europe, liberal democracy seemed ascendant and secure as never before in history.
Under the shrewd and relentless assault of a resurgent Russian authoritarian state, all of this has come under strain with a speed and scope that few in the West have fully comprehended, and that puts the future of liberal democracy in the world squarely where Vladimir Putin wants it: in doubt and on the defensive.
The same part of the brain that allows us to step into the shoes of others also helps us restrain ourselves.
You’ve likely seen the video before: a stream of kids, confronted with a single, alluring marshmallow. If they can resist eating it for 15 minutes, they’ll get two. Some do. Others cave almost immediately.
This “Marshmallow Test,” first conducted in the 1960s, perfectly illustrates the ongoing war between impulsivity and self-control. The kids have to tamp down their immediate desires and focus on long-term goals—an ability that correlates with their later health, wealth, and academic success, and that is supposedly controlled by the front part of the brain. But a new study by Alexander Soutschek at the University of Zurich suggests that self-control is also influenced by another brain region—and one that casts this ability in a different light.
To many white Trump voters, the problem wasn’t her economic stance, but the larger vision—a multi-ethnic social democracy—that it was a part of.
Perhaps the clearest takeaway from the November election for many liberals is that Hillary Clinton lost because she ignored the working class.
In the days after her shocking loss, Democrats complained that Clinton had no jobs agenda. A widely shared essay in The Nation blamed Clinton's "neoliberalism" for abandoning the voters who swung the election. “I come from the white working class,” Bernie Sanders said on CBS This Morning, “and I am deeply humiliated that the Democratic Party cannot talk to where I came from.”
But here is the troubling reality for civically minded liberals looking to justify their preferred strategies: Hillary Clinton talked about the working class, middle class jobs, and the dignity of work constantly. And she still lost.
His paranoid style paved the way for Trumpism. Now he fears what’s been unleashed.
Glenn Beck looks like the dad in a Disney movie. He’s earnest, geeky, pink, and slightly bulbous. His idea of salty language is “bullcrap.”
The atmosphere at Beck’s Mercury Studios, outside Dallas, is similarly soothing, provided you ignore the references to genocide and civilizational collapse. In October, when most commentators considered a Donald Trump presidency a remote possibility, I followed audience members onto the set of The Glenn Beck Program, which airs on Beck’s website, theblaze.com. On the way, we passed through a life-size replica of the Oval Office as it might look if inhabited by a President Beck, complete with a portrait of Ronald Reagan and a large Norman Rockwell print of a Boy Scout.
Why the ingrained expectation that women should desire to become parents is unhealthy
In 2008, Nebraska decriminalized child abandonment. The move was part of a "safe haven" law designed to address increased rates of infanticide in the state. Like other safe-haven laws, Nebraska's allowed parents who felt unprepared to care for their babies to drop them off in a designated location without fear of arrest and prosecution. But legislators made a major logistical error: They failed to implement an age limitation for dropped-off children.
Within just weeks of the law passing, parents started dropping off their kids. But here's the rub: None of them were infants. A couple of months in, 36 children had been left in state hospitals and police stations. Twenty-two of the children were over 13 years old. A 51-year-old grandmother dropped off a 12-year-old boy. One father dropped off his entire family -- nine children from ages one to 17. Others drove from neighboring states to drop off their children once they heard that they could abandon them without repercussion.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
Why extreme wealth makes it hard for people to do better than their parents did.
The numbers are sobering: People born in the 1940s had a 92 percent chance of earning more than their parents did at age 30. For people born in the 1980s, by contrast, the chances were just 50-50.
The finding comes from a new paper out of The Equality of Opportunity Project, a joint research effort of Harvard and Stanford led by the economist Raj Chetty. The paper puts numbers on what many have seen firsthand for years: The American dream—the ability to climb the economic ladder and achieve more than one’s parents did—is less and less a reality with every decade that goes by.
There are two main reasons why today’s 30-somethings have a harder time than their parents did, according to the authors. First, the expansion of the gross domestic product has slowed since the 1950s, when quarterly growth frequently topped 5 percent at an annualized rate. That means the economic pie is growing at a slower rate than it once did, so there’s less to go around. Second, the distribution of that growth is more unequal, and more benefits are accruing to those at the top. Those at the bottom, on the other hand, are not able to capture as large a share as they once did. Their wages are not growing, so they are stuck at the same level as, or below, their parents. “Because incomes have been stagnant for a relatively large proportion of society, it’s harder for people who stay within that chunk to beat their parents in absolute terms,” Robert Manduca, one of the paper’s co-authors, told me.
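To see how those two channels interact, here is a minimal toy simulation in Python (my own illustration, not the authors' model; every parameter value is invented). A child's income is modeled as the parent's income times average generational growth times a lognormal shock, with the shock's spread standing in for how unevenly the gains are distributed; the function estimates the share of children who out-earn their parents.

    import random

    random.seed(42)

    def absolute_mobility(mean_growth, spread, n=200_000):
        """Toy estimate of the share of children who out-earn their parents.

        mean_growth: average income growth between generations (0.2 = +20%)
        spread:      sigma of a lognormal shock (median 1) on each child's
                     income; a wider spread stands in for gains that are
                     distributed more unevenly across children
        """
        count = 0
        for _ in range(n):
            parent = random.lognormvariate(0.0, 0.6)    # stylized parent income
            shock = random.lognormvariate(0.0, spread)  # child's draw of the gains
            child = parent * (1.0 + mean_growth) * shock
            if child > parent:
                count += 1
        return count / n

    # Fast, broadly shared growth (stylized 1940s cohort): prints ~0.99
    print(absolute_mobility(mean_growth=1.0, spread=0.3))
    # Slow, unevenly shared growth (stylized 1980s cohort): prints ~0.58
    print(absolute_mobility(mean_growth=0.2, spread=0.9))

In this toy model the out-earning probability works out to Φ(ln(1 + g) / σ), where g is the growth rate and σ the spread, so it falls both when growth slows and when dispersion widens, mirroring the paper's two explanations.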