Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Sometime around 2003 or 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington, and my web was highly, highly social. We had instant messaging and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that Friendster and Facebook somehow created a social web out of what was previously a lonely journey in cyberspace, when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
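Concretely, that hitchhiking metadata is the HTTP Referer header. Here's a minimal sketch of how a site might tally referrers; the visit data and helper function are illustrative assumptions, not The Atlantic's actual analytics stack:

```python
# A minimal sketch of referrer tallying; the visit data and helper are
# illustrative assumptions, not any real analytics stack.
from collections import Counter
from urllib.parse import urlparse

def referrer_domain(referrer_header: str) -> str:
    """Extract the domain from a raw Referer header, if one was sent."""
    if not referrer_header:
        return "(no referrer)"  # candidate for direct or dark social
    return urlparse(referrer_header).netloc or "(no referrer)"

# Each entry is the raw Referer header a browser sent ("" if none).
visits = [
    "https://www.facebook.com/",
    "https://twitter.com/someuser/status/123",
    "",  # arrived with no referrer at all
    "https://www.facebook.com/",
]

counts = Counter(referrer_domain(v) for v in visits)
for domain, n in counts.most_common():
    print(f"{domain}: {n}")
# www.facebook.com: 2
# twitter.com: 1
# (no referrer): 1
```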
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main culprits are email programs, instant messages, some mobile applications*, and moves from a secure site (https://mail.google.com/blahblahblah) to a non-secure site (http://www.theatlantic.com).
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that visitors actually bookmarked the site or typed www.theatlantic.com into the browser. But that's not what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong, or at least if what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of, at the very least, linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to sway social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, which is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting the latter group as what they call direct social.
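In code, that heuristic might look something like the sketch below. This is my reading of the logic as described, not Chartbeat's actual implementation; the landing-page list and example URLs are hypothetical:

```python
# A sketch of the "direct social" split described above; this is my
# reading of the logic, not Chartbeat's actual code. The landing-page
# paths and example URLs are hypothetical.
from urllib.parse import urlparse

LANDING_PATHS = {"", "/politics", "/technology", "/business"}  # assumed list

def classify_visit(referrer: str, url: str) -> str:
    """Bucket a pageview by its referrer and destination URL."""
    if referrer:
        return "referred"  # a normal, measurable referral
    path = urlparse(url).path.rstrip("/")
    if path in LANDING_PATHS:
        return "direct"  # plausibly typed in or bookmarked
    # No referrer, but a deep article URL nobody types by hand:
    return "dark social"

print(classify_visit("", "http://www.theatlantic.com/"))
# direct
print(classify_visit("", "http://www.theatlantic.com/technology/archive/2012/10/some-story/"))
# dark social
print(classify_visit("https://www.facebook.com/", "http://www.theatlantic.com/politics/"))
# referred
```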
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
This continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact, no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and it adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing tools came to the web's technical infrastructure in the 2000s, the behaviors we're now all familiar with on the large social networks were present long before those networks existed, and they persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving our personal data in exchange for the ability to share links with friends. Massive numbers of people, a larger set than exists on any social network, already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."