Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came search engines that helped you find what you wanted among all that information. Sometime around 2003 or 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in the Wikipedia entry on the 'Social Web.'
In short:
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington, and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace, when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this kind of sharing was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, in which there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and moves from a secure site (https://mail.google.com/blahblahblah) to a non-secure site (http://www.theatlantic.com).
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not what's actually happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
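To make the mechanics concrete, here is a minimal sketch of how referrer-based attribution tends to work. This is illustrative only: the domain list, bucket names, and function are mine, not the internals of any particular analytics product.

```python
from urllib.parse import urlparse

# A minimal sketch of referrer attribution, assuming each pageview carries a
# (possibly empty) HTTP Referer header. The domain list and bucket names are
# illustrative, not any vendor's actual categories.

SOCIAL_NETWORKS = {
    "facebook.com": "Facebook",
    "t.co": "Twitter",          # Twitter's link shortener
    "reddit.com": "Reddit",
    "stumbleupon.com": "StumbleUpon",
}

def attribute_visit(referrer):
    """Return a traffic-source label for one pageview based on its referrer."""
    if not referrer:
        # Email clients, IM, many mobile apps, and https -> http navigation
        # all arrive with no referrer, so they land in the same bucket as
        # people who typed the URL or clicked a bookmark.
        return "direct"
    host = urlparse(referrer).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    for domain, network in SOCIAL_NETWORKS.items():
        if host == domain or host.endswith("." + domain):
            return network
    return host  # some other referring site, e.g. google.com

print(attribute_visit("https://www.facebook.com/"))  # -> Facebook
print(attribute_visit(None))                         # -> direct (bookmark? email? who knows)
```

The point of the sketch is the first branch: everything that arrives without a referrer collapses into one undifferentiated "direct" bucket, no matter how the link actually traveled.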
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least that what I had experienced was a niche phenomenon, and that most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
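Here is a rough sketch of that accounting split in code. The logic Chartbeat actually uses is theirs; the landing-page list and the path test below are assumptions I'm making purely for illustration.

```python
from urllib.parse import urlparse

# A rough sketch of the split described above: pageviews with no referrer get
# divided by destination. The section list and the path rule are my guesses,
# not Chartbeat's actual implementation.

LANDING_PATHS = {"", "politics", "technology", "business", "entertainment"}

def classify_no_referrer_visit(url):
    """Classify a referrer-less pageview as plain direct or dark social."""
    path = urlparse(url).path.strip("/")
    if path in LANDING_PATHS:
        # Homepage or a section landing page: plausibly typed or bookmarked.
        return "direct"
    # Nobody hand-types a long article URL, so assume the link arrived through
    # a channel analytics can't see: email, IM, and so on.
    return "dark social"

print(classify_no_referrer_visit("http://www.theatlantic.com/"))          # -> direct
print(classify_no_referrer_visit("http://www.theatlantic.com/politics"))  # -> direct
print(classify_no_referrer_visit(
    "http://www.theatlantic.com/technology/archive/2012/10/263409/"))     # -> dark social
```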
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or when one of our stories gets sent out on a very big email list or what have you. Whatever the daily variation, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
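For a sense of how those two sets of percentages fit together, here's a quick back-of-the-envelope calculation. Combining them into an implied "all social" share of total referrals is my own arithmetic, not a figure Chartbeat reported.

```python
# Back-of-the-envelope arithmetic on the aggregate figures quoted above. The
# implied "all social" share is my own estimate, not a reported number.

dark_share_of_social = 0.69      # dark social as a fraction of social referrals
facebook_share_of_social = 0.20
twitter_share_of_social = 0.06
dark_share_of_total = 0.175      # dark social as a fraction of all referrals

# If dark social is 69 percent of social referrals and 17.5 percent of all
# referrals, then social as a whole is roughly 17.5% / 0.69 of the total.
social_share_of_total = dark_share_of_total / dark_share_of_social

print(f"implied total-referral share, all social: {social_share_of_total:.1%}")                                  # ~25.4%
print(f"implied total-referral share, Facebook:   {facebook_share_of_social * social_share_of_total:.1%}")       # ~5.1%
print(f"implied total-referral share, Twitter:    {twitter_share_of_social * social_share_of_total:.1%}")        # ~1.5%
```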
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing was built into the web's technical infrastructure only in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and they persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the Web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."