Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Sometime around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in the Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
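For the curious, that referrer check is simple to reproduce. Here's a minimal sketch in Python that tallies referring domains from standard combined-format access logs; the function name and field layout are illustrative, not The Atlantic's actual analytics pipeline:

```python
from collections import Counter
from urllib.parse import urlparse

def referrer_domains(log_lines):
    """Tally referring domains from combined-log-format entries.

    A line looks roughly like:
    1.2.3.4 - - [12/Oct/2012:10:00:00 +0000] "GET /some/path HTTP/1.1"
    200 5678 "http://www.facebook.com/" "Mozilla/5.0 ..."
    """
    counts = Counter()
    for line in log_lines:
        fields = line.split('"')
        if len(fields) < 6:
            continue  # malformed line; skip it
        referrer = fields[3]  # the quoted field after status/bytes
        if referrer in ("-", ""):
            counts["(no referrer)"] += 1  # the mystery bucket
        else:
            counts[urlparse(referrer).netloc] += 1
    return counts
```

Run that over a month of logs and `counts.most_common()` gives you exactly the "a million people came here from Facebook" aggregate. The "(no referrer)" bucket is where the rest of this story lives.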
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are links clicked in email programs, instant messages, some mobile applications*, and moves from a secure site (https://mail.google.com/blahblahblah) to a non-secure site (http://www.theatlantic.com).
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that visitors actually have a bookmark or typed www.theatlantic.com into their browsers. But that's not what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong, or at least whether what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake the social media marketing industry with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
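That heuristic is easy to express in code. Here's a minimal sketch, again in Python, assuming each pageview gives you a (possibly empty) referrer string and the requested path; the landing-page and social-domain lists are illustrative stand-ins, not Chartbeat's actual rules:

```python
from urllib.parse import urlparse

# Illustrative stand-ins: a real site would list every section front
# and a much longer roster of known social domains.
LANDING_PATHS = {"/", "/politics", "/technology", "/business"}
SOCIAL_DOMAINS = {"facebook.com", "twitter.com", "t.co",
                  "reddit.com", "stumbleupon.com"}

def classify(referrer, path):
    """Bucket one pageview, following the heuristic described above."""
    if referrer:
        domain = urlparse(referrer).netloc
        if domain.startswith("www."):
            domain = domain[4:]
        return "social" if domain in SOCIAL_DOMAINS else "other referral"
    # No referrer data. A homepage or section front is plausibly
    # typed or bookmarked...
    if path in LANDING_PATHS:
        return "direct"
    # ...but nobody hand-types a long article URL. Count it as social.
    return "dark social"  # Chartbeat's "direct social"
```

With an empty referrer, a hypothetical deep path like "/technology/archive/2012/10/some-long-slug/263409/" comes back "dark social," while the same empty referrer on "/" stays plain old "direct."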
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Nearly always, though, dark social is our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
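As a back-of-envelope consistency check (my arithmetic, not Chartbeat's), the two dark social figures imply the overall size of the social bucket:

```python
# Dark social is 17.5% of total referrals and ~69% of social referrals,
# so social referrals as a whole work out to roughly:
dark_of_total = 0.175
dark_of_social = 0.69

social_of_total = dark_of_total / dark_of_social
print(f"all social referrals: {social_of_total:.1%} of total")       # ~25.4%

# And Facebook, at 20% of social referrals, is only about:
print(f"Facebook referrals: {0.20 * social_of_total:.1%} of total")  # ~5.1%
```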
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and it adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and they persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the Web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."