Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and Usenet and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
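The referrer logic described above can be sketched in a few lines. This is purely illustrative: the function name and the domain lists are mine, not any analytics vendor's, and a real pipeline would use far larger lists.

```python
from urllib.parse import urlparse

# Illustrative domain lists -- not an exhaustive or official taxonomy.
SOCIAL_DOMAINS = {"facebook.com", "twitter.com", "reddit.com", "stumbleupon.com"}
SEARCH_DOMAINS = {"google.com", "bing.com", "yahoo.com"}

def classify_referrer(referrer):
    """Bucket one page view by its HTTP Referer header (or lack of one)."""
    if not referrer:
        # Email clients, IM windows, many mobile apps, and https -> http
        # transitions all deliver the visitor with no referrer at all.
        return "direct"  # i.e., possibly dark social
    host = urlparse(referrer).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if host in SOCIAL_DOMAINS:
        return "social"
    if host in SEARCH_DOMAINS:
        return "search"
    return "other"
```

The point of the sketch is the first branch: every visit that arrives with an empty Referer header collapses into one undifferentiated "direct" bucket, no matter how genuinely social its origin was.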
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least that what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm, Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
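Chartbeat's accounting change, as described here, amounts to a simple URL heuristic. A minimal sketch, assuming a hand-maintained list of landing-page paths (the names and the list are hypothetical, not Chartbeat's):

```python
from urllib.parse import urlparse

# Hypothetical set of homepage and section landing paths; a real site
# would generate this from its own section structure.
LANDING_PATHS = {"/", "/politics", "/technology", "/business", "/entertainment"}

def bucket_no_referrer_visit(url):
    """Split referrer-less visits the way described above: landing pages
    look typed or bookmarked, while deep article URLs imply a followed link."""
    path = urlparse(url).path.rstrip("/") or "/"
    if path in LANDING_PATHS:
        return "true direct"
    return "direct social"
```

The design bet is that nobody hand-types a hundred-character archive URL, so a referrer-less hit on a deep page almost certainly came from an emailed, messaged, or otherwise dark-socially shared link.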
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or when one of our stories goes out on a very big email list. Nearly every day, though, dark social is our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible, searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."