Pinterest's user experience has drawn most of the attention, but the data users generate are what's really interesting.
Ben Silbermann (left) being interviewed at SXSW (flickr/pantavila).
Ben Silbermann is quiet, reserved even. When I arrived at GigaOm's Roadmap conference this week, he wasn't in the speakers room BSing with the journalists and entrepreneurs there. Instead, he was sitting quietly backstage, watching on a small monitor as Om Malik interviewed Evan Williams under the bright lights. When I asked him how he was doing, he told me about life with his infant. We both watched a clock count down to the moment when we had to go on.
I mention all this because Ben Silbermann doesn't do a ton of public appearances, or even interviews with journalists. Which means when you've got the guy there and willing to answer questions, it's exciting. In the spring, there were dozens of stories about Pinterest. That's dialed back in the past few months (aside from Fast Company's excellent feature), but Pinterest just keeps growing and growing.
By now, most people are familiar with the company's mechanic. You can decompose any web page into its constituent images and pin them to one of your "boards." That's the user side of the experience and it's very, very slick. Silbermann contends that Pinterest's core value is that it lets users plan their futures, unlike Facebook (organizing your past) or Twitter (narrating your present). That's how he sees his product fitting into people's lives, he told me.
I opened the interview with perhaps too much of a focus on the demographics of Pinterest. You have almost certainly heard that Pinterest has more female users than male ones. But it's also more Midwestern than your average young web product. I'm not interested in these facts per se, but I would like to know how and why the network developed. Was there something to the core mechanic that disproportionately appealed to women? Or did they just happen to populate their beta network with a lot of Midwestern women, and from that seed sprang this whole interesting tree? Silbermann told me he thought it was a little of both.
The question I was saving up, though, didn't have anything to do with the user experience of Pinterest. All the time I've spent reporting on how companies like Google and Nokia build maps has convinced me that building tools that allow you to structure vast amounts of human knowledge into a machine-readable format is an amazing way to create value. This is what librarians do. And this is what Google Translate does. And it's what the people who made the map software on your phone did. The machines are amazing at using the data, but we're the ones who are good at parsing the logic of the human world.
One of the big tasks in artificial intelligence, for example, is labeling photographs. Both Microsoft and Google have built cutting-edge (and huge!) neural nets that can identify cats in YouTube videos, for example. They are getting better all the time and there have been several step changes in how good they are over the last five years.
But could the big machines separate cats into cute cats and silly cats, or recognize a picture of cross-species animal friends? Not really. And this is something that humans can do effortlessly. We impose categories on things because that is how humans work. And another name for a Pinterest board is a category.
So, if you take this perspective, Pinterest becomes something wholly different. It's a fun game to get users to embed their knowledge about the objects and logic of the human world into a database of photographs.
That's what I really wanted to ask Silbermann about. What's he gonna do with all that beautifully, humanly organized data?
And right as I was winding up to that question, working our way towards it, a fire alarm rang. At first, I told people to hang out in their seats for a few seconds, hoping that it'd switch off immediately. But the clanging went on. And soon, Silbermann and I were making our way down the back stairs and out into the unusually warm night. He had a meeting back down in Palo Alto. I hopped in an Uber car and rode back to BART pinning the images from my day to mental boards: missed opportunities, humans vs. machines, San Francisco summer in November, fire alarms.
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre’s more glib tendencies.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
“A typical person is more than five times as likely to die in an extinction event as in a car crash,” says a new report.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are the most plausible threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders consider every day. And according to a new report from the U.K.-based Global Challenges Foundation, they’re much more likely than we might think.
In its annual report on “global catastrophic risk,” the nonprofit debuted a startling statistic: Over the span of a lifetime, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
Partly that’s because the average person will probably not die in a car crash. Every year, about one in 9,395 people dies in a crash; that translates to roughly a 0.01 percent chance per year. But that chance compounds over the course of a lifetime: at life-long scales, about one in 120 Americans dies in a crash.
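To see how a tiny annual risk compounds into a much larger lifetime risk, here is a minimal sketch of the arithmetic. It assumes a 1-in-9,395 annual chance of dying in a crash and a roughly 79-year lifespan; neither figure is specified in the report itself, so treat both as illustrative.

```python
# Compound a small annual fatality risk into a lifetime risk.
# Assumed inputs (illustrative, not from the report): 1-in-9,395
# annual crash risk, ~79-year lifespan.
annual_risk = 1 / 9395
years = 79

# Probability of surviving every single year, then take the complement.
lifetime_risk = 1 - (1 - annual_risk) ** years

print(f"Annual risk:   {annual_risk:.4%}")    # on the order of 0.01%
print(f"Lifetime risk: {lifetime_risk:.4%}")
print(f"Roughly 1 in {1 / lifetime_risk:.0f} Americans")
```

The result lands near the one-in-120 figure cited above; small differences come from whatever lifespan and annual-rate assumptions one plugs in.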
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
There’s a common perception that women siphon off the wealth of their exes and go on to live in comfort. It’s wrong.
A 38-year-old woman living in Everett, Washington, recently told me that nine years ago, she had a well-paying job, immaculate credit, substantial savings, and a happy marriage. When her first daughter was born, she and her husband decided that she would quit her job in publishing to stay home with the baby. She loved being a mother and homemaker, and when another daughter came, she gave up the idea of going back to work.
Seven years later, her husband told her to leave their house, and filed for a divorce she couldn’t afford. “He said he was tired of my medical issues, and unwilling to work on things,” she said, citing her severe rheumatoid arthritis and OCD, both of which she manages with medication. “He kicked me out of my own house, with no job and no home, and then my only recourse was to lawyer up. I’m paying them on credit.” (Some of the men and women quoted in this article have been kept anonymous because they were discussing sensitive financial matters, some of them involving ongoing legal disputes.)
Garry Marshall's patronizing 'holiday anthology' film boasts a star-studded ensemble, but its characters seem barely human.
It’s hard to know where to begin with Mother’s Day, a misshapen Frankenstein of a movie that feels like it escaped the Hallmark headquarters halfway through its creation and rampaged into theaters, trying to teach audiences how to love. The third in Garry Marshall’s increasingly strange “holiday anthology” series, Mother’s Day isn’t the rom-com hodge-podge that Valentine’s Day was, or the bizarre morass of his follow-up New Year’s Eve. But it does inspire the kind of holy terror that you feel all the way down to your bones, or the revolted tingling that strikes one at a karaoke performance gone tragically wrong.
While it’s aiming for frothiness and fun, Mother’s Day is a patronizing and sickly sweet endeavor that widely misses the mark for its entire 118-minute running time (it feels much longer). The audience gets the sense that there are many Big Truths to be learned: that family harmony is important, that it’s good to accept different lifestyles without judgment, that loss is a natural part of the circle of life. But its overall construction—as a work of cinema—always feels a little off. One character gets a life lesson from a clown at a children’s party, and departs with a hearty “Thanks, clown!” Extras wander in the background and deliver halting bits of expositional dialogue like malfunctioning robots. Half of the lines seem to have been recorded post-production and are practically shouted from off-screen to patch over a narrative that makes little sense. Mother’s Day is bad in the regular ways (e.g. the acting and writing), but also in that peculiar way, where it feels as though the film’s creator has never met actual humans before.
In Trump’s aftermath, his enemies on the right will have to take stock and propose a meaningful alternative vision for the GOP’s future.
Donald Trump’s big victories in the Mid-Atlantic primaries don’t represent quite the end of the ballgame—but they come damn close.
And now Donald Trump’s many and fierce opponents in the Republican Party and the conservative movement face the hour of decision. Trump looks ever more certain to be the party nominee. Yet not perhaps since George McGovern in 1972 has a presumptive nominee so signally failed to carry the most committed members of his party with him.
So what happens now to those who regard themselves as party thought-leaders? Do they submit? Or do they continue to resist?
Resistance now means something more—and more dangerous—than tapping out #NeverTrump on Twitter. It means working to defeat Trump even knowing that the almost certain beneficiary will be Hillary Clinton.
Knowing the right people certainly has benefits, but how long do they last?
It would seem a safe bet that when faced with two offers from similarly prestigious companies, a job candidate would, most of the time, end up taking the one with higher pay. But when New York University’s Jason Greenberg and MIT’s Roberto M. Fernandez analyzed over 700 job offers from a cohort of students graduating from elite MBA programs, they found that something other than pay was driving students’ decisions.
In a paper that will soon be published in the journal Sociological Science, Greenberg and Fernandez write that the students were significantly more likely to accept jobs found through networking—done either through alums of their program or their own social connections—even if those jobs came with lower pay than offers arriving through more formal channels, like on-campus recruiting. The choice, the researchers suggest, may be driven by students’ interest in their own career development, and a belief that taking a job with more networking opportunities would give them a professional edge, even if it came at the cost of compensation.
DATE: MAY 1, 1994
FROM: DR. HUNTER S. THOMPSON
SUBJECT: THE DEATH OF RICHARD NIXON: NOTES ON THE PASSING OF AN AMERICAN MONSTER.... HE WAS A LIAR AND A QUITTER, AND HE SHOULD HAVE BEEN BURIED AT SEA.... BUT HE WAS, AFTER ALL, THE PRESIDENT.
"And he cried mightily with a strong voice, saying, Babylon the great is fallen, is fallen, and is become the habitation of devils, and the hold of every foul spirit and a cage of every unclean and hateful bird."
Richard Nixon is gone now, and I am poorer for it. He was the real thing -- a political monster straight out of Grendel and a very dangerous enemy. He could shake your hand and stab you in the back at the same time. He lied to his friends and betrayed the trust of his family. Not even Gerald Ford, the unhappy ex-president who pardoned Nixon and kept him out of prison, was immune to the evil fallout. Ford, who believes strongly in Heaven and Hell, has told more than one of his celebrity golf partners that "I know I will go to hell, because I pardoned Richard Nixon."
The U.S. president talks through his hardest decisions about America’s role in the world.
Friday, August 30, 2013, the day the feckless Barack Obama brought to a premature end America’s reign as the world’s sole indispensable superpower—or, alternatively, the day the sagacious Barack Obama peered into the Middle Eastern abyss and stepped back from the consuming void—began with a thundering speech given on Obama’s behalf by his secretary of state, John Kerry, in Washington, D.C. The subject of Kerry’s uncharacteristically Churchillian remarks, delivered in the Treaty Room at the State Department, was the gassing of civilians by the president of Syria, Bashar al-Assad.