Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Sometime around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and Usenet forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that Friendster and Facebook somehow created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site ("http://www.theatlantic.com").
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you either have the page bookmarked or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
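To make this concrete, here is a minimal sketch of how an analytics pipeline ends up lumping dark social in with direct traffic. This is not The Atlantic's or Chartbeat's actual code; the log format, field names, and list of referring sites are all assumptions for illustration.

```python
from collections import Counter
from typing import Optional
from urllib.parse import urlparse

def bucket_visit(referrer: Optional[str]) -> str:
    """Classify one pageview by its HTTP Referer header.

    When an email client, IM app, or an https -> http hop strips the
    header, the visit lands in the same 'direct' bucket as a typed URL
    or a bookmark -- which is exactly where dark social hides.
    """
    if not referrer:
        return "direct"  # bookmark, typed URL, or dark social: indistinguishable
    host = urlparse(referrer).netloc.lower()
    if "facebook.com" in host:
        return "facebook"
    if "twitter.com" in host or host == "t.co":
        return "twitter"
    return host  # any other referring site

# Hypothetical pageview log: (landing URL, Referer header or None).
pageviews = [
    ("http://www.theatlantic.com/politics/archive/2012/10/a-story/1/", "https://www.facebook.com/"),
    ("http://www.theatlantic.com/technology/archive/2012/10/b-story/2/", None),  # link pasted into Gchat
    ("http://www.theatlantic.com/", None),                                       # typed homepage
]

print(Counter(bucket_visit(ref) for _url, ref in pageviews))
# Counter({'direct': 2, 'facebook': 1})
```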
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least that what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. They had recently made an accounting change, which they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
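In other words, the accounting change is a simple heuristic applied to that referrer-less bucket: a bare homepage or section URL stays "direct," and a deep article permalink gets counted as social, because nobody types those by hand. A rough sketch of that rule, with the landing-page list invented for the example rather than taken from Chartbeat, might look like this:

```python
from urllib.parse import urlparse

# Pages a reader plausibly types or bookmarks (an assumed list, not Chartbeat's).
LANDING_PATHS = {"", "/politics", "/technology", "/business", "/entertainment"}

def classify_referrerless(url: str) -> str:
    """Split referrer-less traffic the way Chartbeat's change describes:
    homepages and section pages stay 'direct'; deep article permalinks
    are counted as direct (dark) social."""
    path = urlparse(url).path.rstrip("/")
    return "direct" if path in LANDING_PATHS else "dark social"

print(classify_referrerless("http://www.theatlantic.com/politics/"))  # direct
print(classify_referrerless(
    "http://www.theatlantic.com/technology/archive/2012/10/"
    "atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/"
))  # dark social
```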
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
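To connect those two sets of figures (my back-of-the-envelope arithmetic, not numbers Chartbeat reported): if dark social is 69 percent of social referrals and also 17.5 percent of total referrals, then social as a whole works out to roughly a quarter of all referrals, and Facebook's 20 percent slice of social amounts to only about 5 percent of total traffic.

```python
# Back-of-the-envelope arithmetic from the aggregate figures above.
dark_share_of_social = 0.69       # dark social as a fraction of social referrals
dark_share_of_total = 0.175       # dark social as a fraction of ALL referrals
facebook_share_of_social = 0.20

social_share_of_total = dark_share_of_total / dark_share_of_social
facebook_share_of_total = facebook_share_of_social * social_share_of_total

print(f"all social referrals ~ {social_share_of_total:.1%} of total")   # ~ 25.4%
print(f"Facebook referrals   ~ {facebook_share_of_total:.1%} of total")  # ~ 5.1%
```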
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and it adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors we're now all familiar with on the large social networks were present long before those networks existed, and they persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the Web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."