Sharing is human. We are social. We communicate. We learn from each other. Our first conversations with people we don't know are anecdote competitions. If in the 15th century everyone had owned a printing press, Europe would have been littered with personal missives and opinions. Cameras were one of the first mass-market story-telling devices, and stories were told. Then: curated, bundled, and shared.
The genius of Facebook has always been its facilitation of sharing. Its pivotal innovation -- the one that inspired its first rash of furious remonstrations -- was the automatic sharing of news feeds between friends. In the Friendster/MySpace world, users could visit their friends' feeds, but they did not receive them passively. Facebook's decision to push these feeds out to users' contacts led to howls about privacy -- and that's what made the service a sensation.
Facebook's role in our world is to lead us where we're headed. We like to share who we are and what we like. We're consumers who pay more for things stamped with particular logos, after all; we shouldn't be taken aback when someone tries to spread that idea. Facebook has been there for almost a decade, guiding us toward a place where displays of what we're doing and where we are become simple documentation of the life of an average Joe.
The company's biggest struggle has been figuring out how to make money from all that sharing. An early effort, Beacon, was a flop. People are happy to share information -- photos, stories, links, videos -- but only information they have carefully selected. Beacon took it upon itself to share information about online purchases and transactions -- and people revolted. It was Facebook's most notable failure, and it stemmed from sharing that didn't originate with the user.
Last year's launch of Open Graph began an exploration of how to work around that. It combined two innovations: the global Like button and the ability of some sites to pull information from Facebook without your agreeing to it. Beacon lite. This met with outcry -- I'm losing control over my information! -- which quickly subsided as it became apparent that the intrusion was minimal. People weren't interested in your Pandora stations, but Facebook cracked the door open to using your information the way it wanted.
Slate's Farhad Manjoo has perhaps the savviest take on the innovations Facebook announced yesterday. In addition to Timeline -- the elegant, deep presentation of a user's Facebook history -- the company revealed that it sought to make sharing information "frictionless," which is to say, automatic. Watch a movie or listen to a song and it gets shared, without the tedium of your clicking anything.
The problem with that, of course, is that it eliminates the curation aspect of our self-presentations. It would be as though I told everyone that I was wearing blue jeans and a somewhat worse-for-wear t-shirt right now in addition to revealing that earlier today I wore a sharp, tailored suit. Both are accurate, but only one is the impression I'd like to leave with people. (The latter.) Talking about the suit is Facebook. Talking about my scrubby jeans is Beacon.
I used to work at Adobe. One summer, the company brought in a number of well-known artists to work on a project, one of whom was a photographer. Using Photoshop, he cleaned up his photos of the other participants, noting that "a photo is not meant to be a dermatological record." The same principle extends further: the image we present to the world is not meant to include every single bit of available information. What we share is selected to represent the ideal we want to project, not to reflect who we are. Our curation itself is representative; what we don't say says something, too. Facebook moving curation from us to its algorithms means we could lose some of our personality in what we present. It's akin to putting every photo into a photo album and letting the album worry about which ones to show.
But this is incidental. Facebook anticipates -- correctly -- that we want easy processes to share more and more about ourselves. Or, at least, that we will soon. We've always wanted simple ways to scrapbook, and Facebook is poised to be one of the simplest.
Where it may have missed the mark is in taking away our ability to decide what we show.
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre’s more glib tendencies.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it wasn’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
“A typical person is more than five times as likely to die in an extinction event as in a car crash,” says a new report.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are the most viable threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders consider every day. And according to a new report from the U.K.-based Global Challenges Foundation, they’re much more likely than we might think.
In its annual report on “global catastrophic risk,” the nonprofit debuted a startling statistic: Over the span of a lifetime, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
Partly that’s because the average person will probably not die in an automobile accident. Every year, one in 9,395 people dies in a crash; that translates to about a 0.01 percent chance per year. But that chance compounds over the course of a lifetime. Over a full lifespan, roughly one in 120 Americans dies in a car accident.
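As a rough check on that figure (the report's exact lifespan assumption isn't stated in this excerpt), treating each year as an independent 1-in-9,395 chance over an assumed lifespan of about 78 years gives

\[ 1 - \left(1 - \tfrac{1}{9{,}395}\right)^{78} \;\approx\; \tfrac{78}{9{,}395} \;\approx\; \tfrac{1}{120}, \]

which is consistent with the one-in-120 lifetime figure above.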
DATE: MAY 1, 1994
FROM: DR. HUNTER S. THOMPSON
SUBJECT: THE DEATH OF RICHARD NIXON: NOTES ON THE PASSING
OF AN AMERICAN MONSTER.... HE WAS A LIAR AND A QUITTER,
AND HE SHOULD HAVE BEEN BURIED AT SEA.... BUT HE WAS,
AFTER ALL, THE PRESIDENT.
"And he cried mightily with a strong voice, saying, Babylon the great is fallen, is fallen, and is become the habitation of devils, and the hold of every foul spirit and a cage of every unclean and hateful bird."
Richard Nixon is gone now, and I am poorer for it. He was the real thing -- a political monster straight out of Grendel and a very dangerous enemy. He could shake your hand and stab you in the back at the same time. He lied to his friends and betrayed the trust of his family. Not even Gerald Ford, the unhappy ex-president who pardoned Nixon and kept him out of prison, was immune to the evil fallout. Ford, who believes strongly in Heaven and Hell, has told more than one of his celebrity golf partners that "I know I will go to hell, because I pardoned Richard Nixon."
Garry Marshall's patronizing 'holiday anthology' film boasts a star-studded ensemble, but its characters seem barely human.
It’s hard to know where to begin with Mother’s Day, a misshapen Frankenstein of a movie that feels like it escaped the Hallmark headquarters halfway through its creation and rampaged into theaters, trying to teach audiences how to love. The third in Garry Marshall’s increasingly strange “holiday anthology” series, Mother’s Day isn’t the rom-com hodge-podge that Valentine’s Day was, or the bizarre morass of his follow-up New Year’s Eve. But it does inspire the kind of holy terror that you feel all the way down to your bones, or the revolted tingling that strikes one at a karaoke performance gone tragically wrong.
While it’s aiming for frothiness and fun, Mother’s Day is a patronizing and sickly sweet endeavor that widely misses the mark for its entire 118-minute running time (it feels much longer). The audience gets the sense that there are many Big Truths to be learned: that family harmony is important, that it’s good to accept different lifestyles without judgment, that loss is a natural part of the circle of life. But its overall construction—as a work of cinema—always feels a little off. One character gets a life lesson from a clown at a children’s party, and departs with a hearty “Thanks, clown!” Extras wander in the background and deliver halting bits of expositional dialogue like malfunctioning robots. Half of the lines seem to have been recorded post-production and are practically shouted from off-screen to patch over a narrative that makes little sense. Mother’s Day is bad in the regular ways (e.g. the acting and writing), but also in that peculiar way, where it feels as though the film’s creator has never met actual humans before.
There’s a common perception that women siphon off the wealth of their exes and go on to live in comfort. It’s wrong.
A 38-year-old woman living in Everett, Washington, recently told me that nine years ago, she had a well-paying job, immaculate credit, substantial savings, and a happy marriage. When her first daughter was born, she and her husband decided that she would quit her job in publishing to stay home with the baby. She loved being a mother and homemaker, and when another daughter came, she gave up the idea of going back to work.
Seven years later, her husband told her to leave their house, and filed for a divorce she couldn’t afford. “He said he was tired of my medical issues, and unwilling to work on things,” she said, citing her severe rheumatoid arthritis and OCD, both of which she manages with medication. “He kicked me out of my own house, with no job and no home, and then my only recourse was to lawyer up. I’m paying them on credit.” (Some of the men and women quoted in this article have been kept anonymous because they were discussing sensitive financial matters, some of them involving ongoing legal disputes.)
The justices signed off Thursday on a new procedural rule for warrants targeting computers.
The U.S. Supreme Court approved a new rule Thursday allowing federal judges to issue warrants that target computers outside their jurisdiction, setting the stage for a major expansion of surveillance and hacking powers by federal law-enforcement agencies.
Chief Justice John Roberts submitted the rule to Congress on behalf of the Court as part of the justices’ annual package of changes to the Federal Rules of Criminal Procedure. The rules form the basis of every federal prosecution in the United States.
Under Rule 41’s current incarnation, federal magistrate judges can typically only authorize searches and seizures within their own jurisdiction. Only in a handful of circumstances can judges approve a warrant that reaches beyond their territory—if, for example, they allow federal agents to use a tracking device that could move through multiple judicial districts.
When schools ask applicants about their criminal histories, a veneer of campus safety may come at the expense of educational opportunity.
The long-running “Ban the Box” campaign is now gaining ground at colleges and universities. The movement aims to protect job, and now student, applicants from being asked about their criminal histories and was recently bolstered by President Obama, who is taking executive action to ban the practice at federal agencies. Campus officials say the background question helps them learn as much as possible about prospective students and allows them to take steps to keep everyone on campus safe. But opponents say the question—which requires prospective students to check a box if they have criminal histories—is an undue barrier that harms certain groups of students.
Some colleges routinely ask an optional criminal-background question; some schools are compelled to ask it by the state in which they’re located; and, whether intentional or not, more than 600 colleges and universities ask simply because they use Common App to streamline the admissions process. This year, 920,000 unique applicants used Common App to submit 4 million applications, or 4.4 applications per student, according to the organization. Common App itself includes a criminal-background question on its application.
In Trump’s aftermath, his enemies on the right will have to take stock and propose a meaningful alternative vision for the GOP’s future.
Donald Trump’s big victories in the Mid-Atlantic primaries don’t represent quite the end of the ballgame—but they come damn close.
And now Donald Trump’s many and fierce opponents in the Republican Party and the conservative movement face the hour of decision. Trump looks ever more certain to be the party nominee. Yet perhaps not since George McGovern in 1972 has a presumptive nominee so signally failed to carry the most committed members of his party with him.
So what happens now to those who regard themselves as party thought-leaders? Do they submit? Or do they continue to resist?
Resistance now means something more—and more dangerous—than tapping out #NeverTrump on Twitter. It means working to defeat Trump even knowing that the almost certain beneficiary will be Hillary Clinton.