Sharing is human. We are social. We communicate. We learn from each other. Our first conversations with people we don't know are anecdote competitions. If in the 15th century everyone had owned a printing press, Europe would have been littered with personal missives and opinions. Cameras were one of the first mass-market story-telling devices, and stories were told. Then: curated, bundled, and shared.
The genius of Facebook has always been its facilitation of sharing. Its pivotal innovation -- the one that inspired its first rash of furious remonstrations -- was the automatic sharing of news feeds between friends. In the Friendster/MySpace world, users could visit their friends' feeds, but they did not receive them passively. Facebook's decision to push these feeds out to users' contacts led to howls about privacy -- and that's what made the service a sensation.
Facebook's role in our world is to lead us where we're headed. We like to share who we are and what we like. We're consumers who pay more for things stamped with particular logos, after all; we shouldn't be taken aback when someone tries to spread that idea. Facebook has been there for almost a decade, guiding us toward a place where displays of what we're doing and where we are become the simple documentations of the life of an average Joe.
The company's biggest struggle has been figuring out how to make money from it. An early effort, Beacon, was a flop. People are happy to share information -- photos, stories, links, videos -- but only information they have carefully selected. Beacon took it upon itself to share information about online purchases and transactions -- and people revolted. It was Facebook's most notable failure, and it stemmed from sharing that didn't derive from the user.
Last year's launch of Open Graph began an exploration of how to work around that. It combined two innovations: the global Like button and the ability of some sites to pull information from Facebook without your agreeing to it. Beacon lite. This met with outcry -- I'm losing control over my information! -- which quickly subsided as it became apparent that the intrusion was minimal. People weren't interested in your Pandora stations, but Facebook cracked the door toward using your information the way it wanted.
Slate's Farhad Manjoo has perhaps the savviest take on the innovations Facebook announced yesterday. In addition to Timeline -- the elegant, deep presentation of a user's Facebook history -- the company revealed that it sought to make sharing information "frictionless," which is to say, automatic. Watch a movie or listen to a song and it gets shared, without the tedium of your clicking anything.
The problem with that, of course, is that it eliminates the curation aspect of our self-presentations. It would be as though I told everyone that I was wearing blue jeans and a somewhat worse-for-wear t-shirt right now in addition to revealing that earlier today I wore a sharp, tailored suit. Both are accurate, but only one is the impression I'd like to leave with people. (The latter.) Talking about the suit is Facebook. Talking about my scrubby jeans is Beacon.
I used to work at Adobe. One summer, the company brought in a number of well-known artists to work on a project, one of whom was a photographer. Using Photoshop, he cleaned up his photos of the other participants, noting that "a photo is not meant to be a dermatological record." This is extensible: the image we present to the world is not meant to include every single bit of information possible. What we share is selected to be a representation of the ideal we want to project, not a reflection of who we are. Our curation itself is representative; what we don't say says something, too. Facebook moving curation from us to its algorithms means we could lose some of our personality in what we present. It's akin to putting every photo in a photo album, and letting the album decide which ones to display.
But this is incidental. Facebook anticipates -- correctly -- that we want easy processes to share more and more about ourselves. Or, at least, that we will soon. We've always wanted simple ways to scrapbook, and Facebook is poised to be one of the simplest.
Where they may have missed the mark is in taking away our ability to decide what we show.
Senator John McCain and White House Chief of Staff John Kelly offered starkly different visions of service—and of America.
It was a week of powerful speeches. The least memorable, oddly, was delivered by the most naturally gifted speaker, former President Barack Obama at a campaign rally in Virginia. “Our democracy is at stake,” he said, before harking back to the trope of his 2008 campaign: “Yes, we can.” Compelling in the setting, but not special.
Far more powerful was former President George W. Bush’s indictment of Donald Trump that didn’t mention the 45th president by name. It was a cry for freedom as a theme in American policy, a denunciation of “casual cruelty” in American discourse, of “nationalism distorted into nativism,” of isolationism, of attempts to turn American identity away from American ideals and into something darker, driven by “geography or ethnicity, by soil or blood.” In itself it would have been noteworthy.
Three families of fallen servicemembers received next-day UPS letters from President Trump after a turbulent week in which Trump falsely claimed he had called “virtually all” of the families.
The Trump administration is scrambling to defend the president’s characterization of his communications with grieving military families, including rush-delivering letters from the president to the families of servicemembers killed months ago. Donald Trump falsely claimed this week that he had called the families of “virtually all” fallen servicemembers since taking office.
Timothy Eckels Sr. hadn’t heard anything from President Trump since his son Timothy Eckels Jr. was killed after a collision involving the USS John S. McCain on August 21. But then, on October 20, two days into the controversy over the president’s handling of a condolence call with an American soldier’s widow, Eckels Sr. received a United Parcel Service package dated October 18 with a letter from the White House.
Emma Perrier was deceived by an older man on the internet—a hoax that turned into an unbelievable love story.
Emma Perrier spent the summer of 2015 mending a broken heart, after a recent breakup. By September, the restaurant manager had grown tired of watching The Notebook alone in her apartment in Twickenham, a leafy suburb southwest of London, and decided it was time to get back out there. Despite the horror stories she’d heard about online dating, Emma, 33, downloaded a matchmaking app called Zoosk. The second “o” in the Zoosk logo looks like a diamond engagement ring, which suggested that its 38 million members were seeking more than the one-night stands offered by apps like Tinder.
She snapped the three selfies the app required to “verify her identity.” Emma, who is from a volcanic city near the French Alps, not far from the source of Perrier mineral water, is petite and brunette. She found it difficult to meet men, especially as she avoided pubs and nightclubs, and worked such long hours at a coffee shop in the city’s financial district that she met only stockbrokers, who were mostly looking for cappuccinos, not love.
A stunning new speculative-fiction book by Naomi Alderman couldn’t be more timely.
One of the most succinct definitions of sexual harassment I’ve read over the past few weeks goes like this: For men, it’s anything they might say to a woman that would make them uncomfortable if it were said to them, but in prison. It’s glib, sure. But it gets at the fundamental imbalance of power that characterizes relationships between men and women. To understand what it’s like for a woman to be catcalled, or harassed, or propositioned, it isn’t enough for men to simply put themselves in that woman’s place. They also have to imagine what it’s like to sense the imminent danger in those interactions—to be weaker than their aggressor in every way, and to have that weakness woven into the fabric of society itself. As the adage often attributed to Margaret Atwood goes, “Men are afraid that women will laugh at them. Women are afraid that men will kill them.”
Monday afternoon, President Trump delivered a press conference from an alternative reality, or perhaps a slightly-less-dark timeline. His relationship with Mitch McConnell is great! They have the votes for Obamacare repeal! The hurricane relief effort in Puerto Rico is a smashing success! Democrats are to blame for GOP divisions on Capitol Hill! These claims range from the highly dubious to the patently false.
More comfortable online than out partying, post-Millennials are safer, physically, than adolescents have ever been. But they’re on the brink of a mental-health crisis.
One day last summer, around noon, I called Athena, a 13-year-old who lives in Houston, Texas. She answered her phone—she’s had an iPhone since she was 11—sounding as if she’d just woken up. We chatted about her favorite songs and TV shows, and I asked her what she likes to do with her friends. “We go to the mall,” she said. “Do your parents drop you off?,” I asked, recalling my own middle-school days, in the 1980s, when I’d enjoy a few parent-free hours shopping with my friends. “No—I go with my family,” she replied. “We’ll go with my mom and brothers and walk a little behind them. I just have to tell my mom where we’re going. I have to check in every hour or every 30 minutes.”
Those mall trips are infrequent—about once a month. More often, Athena and her friends spend time together on their phones, unchaperoned. Unlike the teens of my generation, who might have spent an evening tying up the family landline with gossip, they talk on Snapchat, the smartphone app that allows users to send pictures and videos that quickly disappear. They make sure to keep up their Snapstreaks, which show how many days in a row they have Snapchatted with each other. Sometimes they save screenshots of particularly ridiculous pictures of friends. “It’s good blackmail,” Athena said. (Because she’s a minor, I’m not using her real name.) She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”
A neuroscientist on how we came to be aware of ourselves.
Ever since Charles Darwin published On the Origin of Species in 1859, evolution has been the grand unifying theory of biology. Yet one of our most important biological traits, consciousness, is rarely studied in the context of evolution. Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that’s why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it?
The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions. The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence. If the theory is right—and that has yet to be determined—then consciousness evolved gradually over the past half billion years and is present in a range of vertebrate species.
Honoring the sacrifice of servicemembers requires understanding why they were put at risk, and demanding that those who did so hold themselves to account.
On Thursday morning, I planned to write a pointed screed decrying President Trump’s propensity to view the military community as a problem he can buy off with a check. Then, on Thursday afternoon, White House Chief of Staff John Kelly, himself a Gold Star father, decried the noxious politicization of the deaths of servicemembers and how we treat their families in the aftermath. His remarks gave me pause, as they were meant to.
“Let’s not let this maybe last thing that’s held sacred in our society, a young man, young woman, going out and giving his or her life for our country, let’s … keep that sacred,” he implored, lamenting the ugly and voyeuristic events of the past week.
Who better to set the protocol and define the limits of this sacred space than a father who lost his son in Afghanistan? Unless you’ve walked in his shoes, don’t ask questions.
DeepMind’s new self-taught Go-playing program is making moves that other players describe as “alien” and “from an alternate dimension.”
It was a tense summer day in 1835 Japan. The country’s reigning Go player, Honinbo Jowa, took his seat across a board from a 25-year-old prodigy by the name of Akaboshi Intetsu. Both men had spent their lives mastering the two-player strategy game that’s long been popular in East Asia. Their face-off, that day, was high-stakes: Honinbo and Akaboshi represented two Go houses fighting for power, and the rivalry between the two camps had lately exploded into accusations of foul play.
Little did they know that the match—now remembered by Go historians as the “blood-vomiting game”—would last for several grueling days. Or that it would lead to a grisly end.
Early on, the young Akaboshi took a lead. But then, according to lore, “ghosts” appeared and showed Honinbo three crucial moves. His comeback was so overwhelming that, as the story goes, his junior opponent keeled over and began coughing up blood. Weeks later, Akaboshi was found dead. Historians have speculated that he might have had an undiagnosed respiratory disease.
The foundation of Donald Trump’s presidency is the negation of Barack Obama’s legacy.
It is insufficient to state the obvious of Donald Trump: that he is a white man who would not be president were it not for this fact. With one immediate exception, Trump’s predecessors made their way to high office through the passive power of whiteness—that bloody heirloom which cannot ensure mastery of all events but can conjure a tailwind for most of them. Land theft and human plunder cleared the grounds for Trump’s forefathers and barred others from it. Once upon the field, these men became soldiers, statesmen, and scholars; held court in Paris; presided at Princeton; advanced into the Wilderness and then into the White House. Their individual triumphs made this exclusive party seem above America’s founding sins, and it was forgotten that the former was in fact bound to the latter, that all their victories had transpired on cleared grounds. No such elegant detachment can be attributed to Donald Trump—a president who, more than any other, has made the awful inheritance explicit.